[Binary content not shown: POSIX tar archive of Zuul CI job output. Recoverable member listing:]

var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz   (gzip-compressed kubelet log; compressed data omitted)
q;K/:`DpᠪoUך?!sP]%ej񹕒l>UR˕_4r{X{Չ}J8J|cEUI]nxKc:[r[;f)wTxWi6E(X)o^fTltEzJdn*}c -'kvj6mnD̀}^3zn(oMh*PB߿ڏbvb䑤B`',ġHf%NhqTJƄtTˈ4' ]{%txLpf 'lJfyg9s&WxG@Qj!r1An=8 )㙠1ty9{&sTrCbNj-s%Tta NtB-` Hdg~Tc▱/.38:<e,`ti%Jb#V9:ϪfmzB )dM5A0ɵ r B($<3@cLWL# Qk%i]I?t@fz^.=?Zʹ-b=,|T:LsraF9(fE-J$.M8$0\WQu7@[WJJ"6 <(lxu+P`B<A*Caj3ւVd2b@Dk45[!-grZ7?KUwu=Vjߘqڋ}};iWm=:ܑZNTir3m6fvG׍޿hZi{S3 [IS%ʑI@hwVjrd_۞M:Zu)˂q6o%;9n[=Hr;?OwYyv3x6 ,Y?[0:};kon|9BqJ>BƄF;$}s|#S!a 2 +ލ ;`#wB!™.cBwz'RY E4u3+)ahW!ʀB"#Ƅx-oҡdV$ܹ43 [ ecMEIE9CU)J3y#H(y+Ć"&4c7C]2B٠g PVެyo|)AЁϕ&ӯ67ЂI9 ;jLWʗVJY2gl TJ#– n;sXwIB;0w L[A֐). ,'@:Z|Fca퇄E.dnM})oȨZz9E1q fSkZڋ(U 'yV;0$5@a9N`$*ŰBe[DșLx54gg'N?{dOȖ=]ɬZyQFK,2Ă ,ׁ,M $H8⎡h-Jxb'8>jK ^Ȍ ^`A ƜJ*fFn͸EfBYN",dj1qn؀#62qd J?{3Y+P\3 JFB:`^eO ) Ӂq!"'UleI;`xN(E.qVqLޕ6#"i (C`1˾4xRInI.k0}-ٺlQ͇rٙT*dD|󨐙B$^ȴ u!@vrR"++"@jպ qQ-yZD)a1dqKkdN[H$D L*%:&y /NI/D L2chpGedHhnS 1PKw 1AxGU:J"g3z֏md+lʘmqgi6 ZԔ+3(f?m0ݗr5^srHH"2&g 愜MGL.3SwTȭ1ϼ#գl`đeG=zRR/WWzCNm4Ҿ 1}n[gp_]u 覟.v{+lof]ۗLA4W~b~>1_wv 3||8 չ\g] zҪxOO?~0A`I,8 Q5"V%򒷑s.N6'Y܁g_Ӯe麲[J=KŖ˳7UW *PpF&6,)e\LL-S73TLq#+frz@?TQ;W\)ALg#2ڜBj%%.2 z Ln+$@$2R)H=o}JPD֏2\,M?z=آEo. 
Ӆd4xA}%ǹ3i)ͽx" ʤ'X +,+/?"LHϖ}2v ?5.~w^'Tl=gd oqyS7çξ{yCMͽ>/þb۽feL䚢)څTʶ栬AQ/AyJi |ڽ~St 5Kqb*EY&RaUUaՓ}Vq5ff8j*HJU /rEj*_ &'yڤ(yKhLOP{j))'S!9',&cF!uѢ/\\Xύ$!\.6aP O5Zks 1%tsbl.YoK겦˸g"/xPPn)|>*<Ĕ({Zz<OѦM`Ws)J3L"9P6@8̊3Og*\\T" 8&R0ĕTKj~ TFq''EqI@unxF HZq0@'ZZFb<ƙ٢T.t9r3sY,<{Mtĩa?g*|JsFo]@LūG«}}a%*llQŭ%YfonF?_sSX3`)kLHڅhp`9>HfEy}{%\k1?.њ9pKkU`\ NN09RA0LJa O1\s#SEp/3Pb/$gO|f*f&̱Lq3M6n&WML槞4fl\C%/B+\f:\Ow K^XtB622izpXOG-ZpS9Jq)Ju &J$Ő{!{f>s OCCkGikТoƇNc:wh~/bYb*9*RQi$$q'ǝ1z S*,Ï`oHN;kdoXj#jk%NR^rL<999x)z|<'i+W=_ZnSL{;+#80I_͜پqrPLD`h7UT44(I W;A~fP1@C y9JIarQ1xΒj=+2S|X4cajNFi-k̰"t_fR+3Fs:qW\™1+c1X֕CWՇ#3Ve3ţcijVlmW U&^RKk@DҶ%d\nGfs'Uʝ{nv!諫4OGǑE~<_B1-fEV,w՞I:u27zq`ᙋc7X .]p=Z@ݨ!pL؎y'樔tq~q#^wB9e(N*z-VnCgp@935'cBw\O~xm%f֐,d#U^RG5exor >* %5'7;ǯ{ڊdo,.!:l u{xJ(qC1M: 19x˜'DL{Sۀ{1ptDh@r۹ vUffp9ԻPOx ͑AtMbr)|sͷTB'zjl (G <mu BV¤S::S8\k "RE("QXV: >XTVg]a2r\ Tk$HMr3"}IHp)j&iib5; { e:O0qQ֭?_nlW }u05lMm777ҍ rSJ5k&kDcDjVuSMbgBLqۻA^;>/,x%K[_c mR$v]@Y Es2Dy &g F8]Ω* >2}ތ D+Z0>n)pgF!Kjt>t8(E:*)QWB i8ŸZN?D¶v1r6hY厲^|xs*Y?k~D~В^%,}_qON.g#7Oj3Ƴ£x!J^Fqg-ъf{Qy`@=8OXTA "<‘dRBrhaQ 8e(z^ q,tJ`U Jn)` Ғ9%zrYXlgV²PTY=%W&٧99lL贚j+6gqs}ۃi1+5XHJJ_ %4~|PR-i.gyY xID OL3Mb" bT5xiSVKlJ/Y^Q0.A-Et,6JP %&2` ܙ!"OD$4BhSx1Z[Q^JoJy{a!dz zu[q[ه}OͿû _?q/|xcKd:4>̬ 3o[z*NʆTn擨 b|z]_b:U)7(7fb;a O?㕿?n.5^e8߅`J;??>nGq7r>ӏQ+~ݨs'-ep, jIVO-v?; Q}9T:e ]DNut{"ˆN=cӟZڛ>cwVξ`fw{S܎_zdۑ3%{=j >p7Rsq&ˣ^t^Gu4SLFO7 @2Qf`W^jr|ZD <aM`D+±Bk7d!2m5s*25NR-Q;b,[rǑyQJɛ(WG=6iM:5ѝ΋\Kڴind?Iକ]LGld]oƵ}%yΞ<7r~ k7a[.g Ӣb:A6mgC?|Kv8vr,{78arkUWq\H;<rR6#WVDu@D㢶 @8et[^i%! dR)& u>e)kmcY*v?_8 E$l~J)iv|ɡDJKj>%kf+)s[ˢ!I &0(h,2DzIǿ6*jB6*b04F_9gQ.Cp͚:rB錘]ŵ ĮfyM}:l;:;B ѣVmsٛb.оlk=bfmnJ b}bWJm\] \D)e3j?s~*8 \gWf2 ^ T}b<`QS%`j/\sAJ;\+McWA7pU̵/p%X{+`>̔ \s \Uri+ɔaGpU *,ob #\q_=Yގ?ߺ.}CgYR${X ~J]D (LjLt4Ґ |L-++wtS0STبRL#gPk+% 3Ɲi[n`˻W r (~ 7q .dfwS ɏrȒ< $LY:"% }2sVsBfJq| Ӛhi|89XKooKx')߮ۏZU$&ڰlU!t8*:T|J3'w_ﯻ9ܗө2r8ݔ*kf|L.^XHq 8aVLﺎaX\ȲZG!Gk\J-\|Jap8G "maJ5H|#u7VɵYɵG%~[ƅktN?R! 
oD0+gT iF]Ө;S:p,Be׆{Ƙ9)L6Y"s4VY) NlN8s ~tFWu̘EޫߎIK4 E ۀ#1e\I .]18~d;~18 5Bgȉi1\R¡S@]efv~g]fmWSHIS%-6uiMn߅DT ލ5$kmٓ' h$Ecrp-9BH|,QeUFJƣ Þ5SiT'T2zTs{SV)k!E-  $?4 + 4 AL $^$IrDpJ>#EeԒZr%\Ĝuh\K/2՘=QdB+ )9RELՍi!%BHu),/6S2R8 3k {yVW+oLW3g4"4YA I.)ZKLH=޴2Qۛ>49N>?Ez-]z)Gŭ>@CAļLw[p$kF3n1XK@FGh#1"mgY&=UOZ&SϕRIa39BYh1lV /5gӪa͎02CLfiy.ad)+9MQ\9lfs\`mz3^RK׳~u4O^׵CǎFE~GV c 1-;Qlub[I.*[S %Sܩtw;"0$۫OfnNRnz?|&A\MnAOɄV@3 F|u0]VZ{}&2fd촉aM! b2,rC0ܲ)8F7Rp%G/L{&! xv\U4pnoA3_>?& \ | -wI.jc~14i~y>~93_1OO"k9őqaij:1һ^52n3j)6madٶz~ |Lʾ9ܓ%t/mغ>S<=0 I&&tkrAǐxG-XHYsc&8 lB7},oqoN~v4BT털!ޗtG`8U#8wWz--jޖ9/zS囯͑BBs ̄laL̓924 曂M(tN5uTg%m.=5.ξڪIQQNz{|~U7_,A$扡HNSQb'x26"AplG#A!gyцW@37JIk郲(BFB3{N mb텨U\^I<.ǃoMd&oyLQ(qdo#(X6 @Dj\%?^Ԗ8o.DyYٚC3`{TbW_-9M|d,A"x̵ \ůB#s##A&8Y⥷i%ɴ@:>XX-IAJβ$tȉ,e5Z i|]@#b;9P5fmnQQ=N/|}o"o0!gX QA2*fĕ»5]|T:IQ!= ZgzŌZ&OQ.\JJf4xlJ>eLc:3depVz=W4/b)zkeԕN7=sa:Y<5Kw+p-YbYTW efw?x:;ҭhOhڼXGƮ5 _B.osmdMGW,Xܭn|V*.v}vǃk.μtpw{|w8v~1Aɷ8ts;zpg[n>j5iVlߡ4U#wP"ȭ槊͙ԼG`գv+ւ>^qJi[w+N -;VNrnR\ADtĥ1 DJQʃbc[l<$mFrm E7V#VAQO:eRRi*ef(]_ƃ)؜u/$p?ɪ;xâ X)˞2Vda΀f.4z5s' %WI~`]R~ֱ pw$AUuB+6-mDKm9:p g{1~Bc'V˄D9'eVȔyNok XFS\zL?N8_xbxnO)r)b.'K`?Xz}^n֧ȖO526S0f45php0wȌ,#3G F&-o e7i{!)A4#cJkq= {M8~9Zap#WZ}IU'10@GcM9Ol୍(bVzc_;9۩Գ5QMc8rh0cMehmѢҨ+:jxSOrnI['ƎNÎ jt-`eo-hQ-fV6Eha`ioઘ+h_X X'-ӳ ' \N WY 4p\=6zǚ4nt5iSSo/E:hi8ǔ$h@2gbL.yN1㐬7)82hQ t.1hUg.i4ZNT)*yR:Ez1qkJʊfo^kVdI)&[ղ8-&SDf_C>-Pki`cK=ճ#ⴞΧRQLn(-<8F\hu6mKD 4g%Ds5J $lTyf噕g)4T8"%F#Ie y%ՒZ>$C03Q Ũߤ(42(R6-H]EK1c\$y+`$SaYL& vFg͎?u!+۱jy{_"yd3SF)њ}~lږDZ6>D0sXVY .KH!f윜 T%Τ Q娀zy>z9U3WhfU̪S%˺jzwbŪJqdx:d./.LkWe'/ۄz֒Z6_bj[NO=}hMfTQPni "*: wj[f#)Cpo׵I{ g~mk:f"L# M۟z-\ls7MYA@ӆ)xs׽&\6tдja&즼'f_dyyگhOpgϝTinB`'}}M{nw] a=q88)7.׽/ 67` zʒJO6R%uTSƌ)࣢Ps%YN}sbqvsL 83h-ʜtJδ8ӚIţ ҦǪ ZrPSt 5.C($Ly~=qU>(a0yY𧓞80~l85 p^t 9X wLNhJ9.,%@3e f*!h)[}/>a~O<oc_[3G.T:YVs"%W`gaPޭ|| A1 F7FRgшPI>$E$F|J"^J!"e Ӥ# D{\Ln;Q){{%Mo3>[Y;x'So [5 ,v!# ~[7FoP#nY68zK$LHi.gyY)6Bjp524H+:ӉZe^ 97lsJJMd3BDN 9QA!1Zj2eθWPyh}zkdbsL׈7 ϥѴ+|zzR!Tqws&=["ES6llTyp+Hp5 }MCi艰&0M$\8@h^&"V3HPqj)gA T1H(v]9C"e`T(MnpR9& 
ogXl~[qQ\i:ey/vυp6jk-Y`q4l36t*3uomn F[rh,]sk3YG1Om-ͤubP};MɏKΞoUD9㚘c%hn&()|sTz,U)zGX*CVjW\(L Z0ӆ{O) |*&B*"D `bRLXg'~%=@X:h:7\ \Gg#Q q N1E`$͸c @xEXWn9(RTueHvK]Odri89 vǛׂas{+ݰj(A8T޺9kuW jnsi GCζTgn}K>~['nrnye'kzՉSdGjyn\ͬI{Nϟ '.GO?}OZ+9z?) <28BK& Gt\c=w < nJI/4`{4tto;o=nn洎R0Ɔ)=/KOoxt&?ilξ`6P z /OI6Q'!uB<A\qyЙ^g%"Kh \eqj \ei8wB)| WoT* îZƜ;\~i6Õ~O.C}O!-ৄg+ ҪSgKR W^2T1r݊pӹWY\WYZMPJFx7WLqWY]=[ܣeWYZ}) $\q-+iBq5m+P}pŕTzp2-+Y\FWYCφ*K)I7WBmע.BDJmvF:a.Tpi&)DB޶]҅㜡.z-~xAJpRpN/מ-[Do/w!E*fs$xhfDX JP\Xd 6F h̊hXQÊ^VҖ\Og'88 mIzM/ri)hS~)uSu?{  B] F9D" 倒Cl^9aM~mox=}nHԒeϺy\D'fzRC>ZFWN5icRM74a迍Vp `Tp )0MΝrm+G/?%`1< Δ4lj0A)Mq6G{Jz5KUfoݯ#Ӿ00ǃ@$SBe8\HfEy%\k1%n8k|ʟN24sؐ%6! PT/^5 >j<( &gϿ[  ~oFƲsƻM K멸>=7ST%[VI U!*hfhDL*q&e%*G9K"UA*^nlހ~"WG2"|wkU [lMV٦{~>/x5~Ӫ-F3bV`UC7ջ#+VU+PՉGcyub0:{vTKgMasf6.t-D3/fʝ{^-:풯mNHM'Kw ;fo[ @]= ù ɖZw w ۄ)ow2=moǁt=ųNY4Yg!J^[̡nrf(jZwN z? XCBRTyIՔ1e=h:8/\h98OsL |݌_?Xq'w=r(a0yY𧓞805ь_9*}̫ly`\#\(ZeCdBuFSqBfQ( )c8uRK)%/:_iw״ˣ\13p%IB#A(Y(mL H] DSl#)AY4"TAphA>%x'%ҁc$sN 0 W @**#cD_W\MHɂы5;|Ky+C).Jg<ܾN`.Oh&7:q5;`~﯒V=EZII2E+$%OzU]}wg.Omz <_#p.v ϭЮ*v1ni0Ù C]378d8/ͧou{.tBzfr'j&h''?/f`_ZoyLXiuCZ)5֦L"z s*;+Ӳֆ-vKg۩*yt-d4xA}?3i)U{'ghɤUTYOV`7dg#Pglr'!(fJH6 Dҥ=YKb0,SFK,lenN0oVyu=4/2qsV $ K+"aDdEaRYﴠ !dўG&D%L{m.$PYFKׇ,&n{ؒG6bZDU"jE܊6@4c8$R\ GkscB1J'D "BhJTHeIRQ k2"Q{LVQy(lO¬\Yl.vQWXI޺܃RH6O9ZݝKgzeMy^^8Ȋ2d5{D1 "β[͇ Hi.õ,'R7D OL3Mb"bTMU~<ξvW:?1%(at\]&2` EVa:x`\$Z?L# \lmtkY?λ盕RH' 1hϑ~\`S6llTyp+Hp5dD vJr,Yg!š4pXʞ29@ mTKDԎ8 eRpdq&^Za!{nqn~- Uy,]3ΰ~?ϧW:YȭySVcco>*ܱoo{qB;/>&і-.ҍ''gw76~ -\zL,v /njmIZ? >_us=X9ݙUn~$n%CuTk;s=`*%;RԄxf콲SeQ`J;- ' SAFDJ!"QX.1΀IS ĩv!ă2Xi0Y`jsx$T!0{/&Yߌ'oo#_IO[4Kj56oWUk$Hs6fD(8Vhf1 ܠErޡ4g8PPdQխeu]]AU@#KFY ʕ 5jĕEm'AQq*qD%sB2هdR)F #hAQGfdB˲,Rg`jJ=!&ji.*`w'с>i@H"N҉J=*&6ٮ-vaOp`Bb2.b5WjXёБ\ZɳBZɳVF'+? SVc1VmWUQeM(&LgK4*58J' U~r_h3pNjti2"wכFYtӖR2|%gS4a+tCpPr.`9VNF83ۼU|\AXzn|1woΝӹ14(At*T f_G$YOS5 4N; ᫫5x7H6q1L;$g LgxV;*?d-?_7pԬdЍá Ty|i%sƖHBDmvT% Sʊl%w P]mda>[f-Wasmcr6:-w?. 
ʗ@ndxK5|pߛu d3DF3)i-cMZ8[ FRN-Đ@H"j8 ~8!Ц,4NʩR H%r5gQsL( {+sY'ZhYvꁅ_ij䋾|g7-e"ì1HI,ҡ(x8qiPҴ'@#ʜ\(֢+S\-&x#32x5sf"Tnd֜ȸ IƮX3c!X8frV*W,#_nbe4׃m[Ǒ!GOjCS4"MV%+T%o)hd0 3 j-$& fh/#{HH H 9r՜͈m}?\ jg]Q[dFmѡv`VgwF:&X:<8i'ͦ<\ bN00X#88f!3 hG9AQNq{PV&3fkf*Ub™6LIABXi-UW2eʊZaQ(QHLd$&:łZҠO0I̥ޙ1[sM4_:Iɮ3pœYo]@)2~,S^#OVXʂ[v%WqiTpX,Qd eJ+gI1u[YɈʰQ@"x<8(\%J856|5*Cz l7g)Oc.*5xbOjf*H/$jS~*mz]5amWlnUX:l*vuaWǎ1>;sC;774=G 150pJ&M,-C)F*m:P*ľ8 5cTX¸/`?͒]b6Wy.Kׄ}ҳZDʺL@yM>jb w MX5`=g {(C#3Ç]]Gh;H. g1_~wFϑyLg.KC*g40qm0Jt.Me;c4(jpPe V*b1 PP렍P#2|RD8wuǺc+ul8}jUg7/.Fhq4u_q;?!iXKbH gaQ;mt V,,gsC֬4Ѫ BkM̨Kj a# <',%wIPJ}:[s6kR bmg_]nur>Bگ<#y Nvthu0)?{M\cCV0#*)dޥ6>VLʘ} l3l9A޵>H#H9y9*"*b.S,a/N/Bq_a٥zNP\ ݇P׍L&,U$yQ \ ܻhWM}=7Њk66p\_u|*_3 Gf *Cc1#1-)G`Z+oퟒuzU< hUG6Ҫ9xԲόG0:RL!R>P4!kd&aB2(-ұdAy@pGч=@Q:0 ,0"蔏yOkB0jFxT }<8M(rnS "(rbrX+Iј^Sl8.~Hkbe{J읂Xפ!((&H姘uUїm:Sz5dTh`L{SZm)j,v<B\+}yxQ'枵$1-+}[4,VI n0i§ftY5tB/]|O36P`e7OH(\zk-!mm;_+Ɉ0Z& Rҳ 5g_Y1˚}א8ڥgT -yz z] oFݫA~]ꏪ_W~E曟 ډj-ꔌpv{ /(޼^ߎLQ v!}΋2M: >.S/PIG2qҁgK#1/)SL:EJu~0`NQb8 ȫ`Ӣc N`-S2:!^a(hK2WE$ˋ:`ɥ 0B5ӚU9حB(ZArZS fJ{%֖ J =!"}dhRRA9);Ŝ(ars֜-,GgE ڔ;M10ya^?bcn|[6/Xө*{<k$j1c("9фڔX<wQ2Qp*vq 4i6)`Λ [hP}e1I98[zY^]hڥUBWŏfXW`0)9w\rvΜU UxtW=!k)_qٯGahF¦ ږpnfWdLz^~B?*n2Uyg=g4I󶹵ׯz0 ũz\I=C[P(4)Ex˛ZGRJhBo|><})y-Ō;9L[5c؂eւl+<.♃M%uGdT4|j%$uiKl]eU%>#[m|=# Jp܌ Ef3JQ?[;E(5;6bb_2E/3h@b,lFe9ſtߍy?`s:ITjucb`y~oLkjO1>韋u{|~qgw3|{=xo;+̓W(ʣ0q J<{S*j=Q?Qb5-zd(Dh CNR+m lpU4%s @0%xe{&Xoi6V㡏O͇?񀹋HU VM6>aefluU2Al1G<ƮHϣmH~p~4kcYf~;&ؖ C`2RٛYyq=k#R p9&; {;@T"0#AUɔCUsT JСU LX"gK V V7j_}Z[Nh5kK4.VnXQS,b";mv UɨY}?)[{n&FQnxK[o|['A7ʇ#A pGBǀRv5qﵫ=h˳v5߬tc[$q-D/vwgo$]wPe4؁X"8 KNU+.e=V>KCl/XƐWEdSm;L*b|A 'tX<+.ٚ VWJ,Ad9E6U\Vs,XK~;qWڲTNiK_wENmXy#.ϢX[ C9XmvM m>+JUtRzf# 4\\]x/<cMV~Tg7i^gR)Z_t4-xhPXeYev鈦eL۷`姖f}XW0P ߮CwTO+}]+9cԞw}`})QE::&X$2"w_.f>Mb9~}?k;Fտ6W]]{c@Χ'OG3= WV9Du| ܎efh0_f7KKN]VնZ\3$H5[(8aMDooGBIwհN % $TPJ~G~%qXރ&FNYv3g-g7wa`XCn~p$5޻GXZʻW*Ǩ]RRs((H^mQa9yWs̪ QRj#SO]M%Z!IZ#* nT,X# .> 7k5:M_4՗ˋ#6 *ٔhtUrEjBUD[o+H74QX`ӒKL U{ʇj JUgx{j屠v7x*jSgԦG5Sa4)&L% (.r>HK؄G'bTlLЩj"+=jgDE'[`M.*)'(%/D 
V~ڀX<]gDt#"d1*(j=6!wŷR-sU&eꊈkȉt`uLm8ÙBE->%QNCj,;#b7sȣf1uvb茋~ysw\<<OCR=z?{'Obm!_(+W)cxsrn>%l1ñp)!pR>'-BT_,d,V(  2"cVPm&NAc?շaTں8mVZIe{4ĦGx6=02ߚSq?hE22n>yN:qFW'],~7ߙ<޺U\ʎln[Ov~z/D{}=Cj)Ӏx6둲Ej@ǁ P_qVŽ^wUaU!dF]|͎b͆L:sM=KP4h:) E騫oCȤ0zY\LdϽ{{!#1X[ Y^/5*}R#,C?؏LXNȎ^9dio(_ijt-Ay;}$l#z0#@.T5XkyM‘r 5'Ff:N~}2ڸ C`Q˛[Zh'ۑi2XLkz& e娐]+C5 `UC%8AT&y7=]FFZ5Ҫꞟ<: SH^R9Pj;6Lmf.©vl:uތ`mGg](6xL>}_PۂC2 c#al%Z(`reXur91&3bL6ĒUE*hPQc2*i{av8r廙;CzЯ$Zl˜m=tdLN = )Fk#<ⳋm8ds:LRzkK`sAزP*Kz6e?~A[C9G.ցAѐ[+̆c˾h"c^/o|w(W%:~y Z~0U4dR]88s7?3[]-(9"k0l0i6`dF3M[m{Hr|76B;J0D>?A1s~jL{}nʓ:H---!Zdȃ `( EOWǕ#]ۨG+ցk!U|ɡV県\jbB7y%!3뛋K#m |ivhY hŒ<(G2ًxb\x.;  5P3v)h[a"@!DKj؉\}2^vΈ\"ePդz\f zjV7-|{d׸ MI>)ɪ7QYL`ϓa9geZ:Ȫs8xh kCt \*4FtU jgl5~ZÛ}g9]CAݜ*$aD Nh7yh BHEBBk%=}tLEW7TZ6?^V+sE#w?N80NWѐ\-k,֒H32xWQQ9;yAf@= F`8`;HɡKE  `HDȨާl¨amEpsW`TRJUY Z28!FkdMc}9#<_%nT|:=T/6h!-_ɻ |Vex3&̋kCdMFo`,=Q.Vۜ* 2YUz>su3;#ly^l)~ T(CY8?MpQ|.Ji-Sy O1]6?p #ɏl&/bT3'oAF9&m(BH hqw֏iÑ `K-6ġ]F).xҰى/61Lk 9gEll)s H޲b R%㶘XP`stpLNiuK-Wvu:+ (mO[1Sq= X0_]M;yt[Aˆ{6)M. fK-[v;.&m-op?_u aKH3SVSi#ƜTv>8fe&|O ! 
e9blB IaK&쪬uwZz9ikM@|&Ȉ-` AzKS69k:O'sM=ל (u:QJR5׊ι*%n*KRuF_wdr Nz6ŇGߤŮݩJnK+ٟEljbS[?g^pč]~bT/i*]^vUV ܷ(U}UU=B^Ӌ+:*BtfJ9 O W ɧE\/@ =Juzԃ[W[e_\!\r/gWҚW\xg `7 ~o)fD|*'|@c\fp%LI/~8t];xz1xpfB\RV鰾X|,-7<:ߏvu)-kkb^\|vu7| SbF;_6۬ods5'wLzU6iPT.$Q2g5?o6wCmV9}ׯ׏ob$?ݧPҾJD#JҦ- J%+5(m4>7oJ.}gf}way-o#?|+a,puEhF%[r[;B4du(Pk JhYLƪ%9OZJ$PL$bd2ʸR],"usؘo0wO 4pF݃touP 嵝3ZWkPk,Q*U*\F IjEƖbL.);-Sy=i*Z3&Ⱥf颚YJ`R:L:V!ZXH-B[vg XŌilJYll&ۀUk%i΢rQXW="-:huQZ1dbGi"3@s9FjRYʼnX1F@T֡:a9I !LXDȝL`ȧ|~ b,of򝡖G$hKpP + c:p4MK` ߕN.G[@˅Ue ̜c $(v5Mkė5XH脼VCԒ"NdiEUHtHta:X0G/> ×7)y<-ex P, Ѿ`ܬLt+QϳW5 WZ9ҶBM5tYI";> Ƈhvm2lwly;hOY3/Vb=^EIC.`AU@@P ||P-Q3 /@ 顔JrVh'/9]ꀇ]xdWRw56̬DK&K I+]K@^a:y)Y}G;`QmRgV)QK(M6ArQ+byc5@բ`&yh%bl9ـJ6`%!ud19MQ(xƻJu)IÏފ"`A-FUVPB~ `*A}o#OOu)Z 6mM:Y{竸P*cbWkYDCX@` aiFQhFQP!mLa#wQdPf1jS11+~}q%JV ]"Ș434b/gfU(=h7,JxKT@&L!Ts ^!7Vh-v_hG.JTrJtPX&R **mz x"`͂z=~1V=ÆfT( ln7ĊPDԥBiRC'W WXޠz׻,/V2<?PeQt@G Y7g0tsX:g@ @ ģ0I1AT6G ܘ%XIiO 4֬U T)JK`KO'Qu2j2r!aR5g>Y YKu~{P|P|Oo%V$NQqHb .̺SyaiaP/ZVFυ4Syf PH g՞s카;Xg7v֜%Lx sʨI8,MrAP$ Bq h+9b $Ti`ːʏH] !Cx"`,0P5嶯z?-]ܴZ]0`.J=W[D Nh5/7~^q<5Ȏ7WKlv<3|ga i q O+_R'r:jҋ ȳB$z'wظ(CZ?d82&=-Ɲ;ҿc>__~}LVοe }q6xS}4ogS8uvP*߸I|w@ߵlICFTB.kPX"QH>#YqyȧA Y#?#!9VQs1l6TB^Q·P~ggK9#NZ-"؍.qB޼ݛ{vo۽y7o޼ݛ{vo۽y7o޼ݛ{vo۽y7o޼ݛ{vo۽y7o޼ݛ{vo۽y7o޼ݛ{vo۽ymޖ"x_P6ipËiF4ϽyTyl5A*"wtڡkCY^~5ꭊ([ vǴCv;jй5ӔNf XȲPə-DVukl!=3Cvh`"PyAڠ`k<Ԯk6^]\>-kO3WK_.^iB)m3JϥtXXH6_W PBm)$jwŻU^y%b~۝速χtНAw:N;ttНAw:N;ttНAw:N;ttНAw:N;ttНAw:N;ttНAw:Nu:@%9 `2/jbQ:ӝGEJ/t]NGfo-d w,L\1QU+2xc,ix8b&'oO>,5M7'L_^9h*t.oN&mvtOf܎^n&ϚɯM1..nuO]#f{\pw>\?6}Ŵ`"L|cÑ?|;oʁ^>'RdB0SNſl׼hA`X"WNt|J\-/n4u=]O9bS201ELRM3)UUҠX1$ T 7j 4Gጦ@sh|i߄b&/C?gWO)[֗6as#Nxuf6,Ώ8'mSo^]L7!~vt{C[z gSY+scÀ=^?=lz r}Wwn:]9G8rf7'ޗ>7ܟ՞]~_ ]_19t8o-~kd፫faq(5sin$GWQɎXlxueŎ=)[MqDw'DJ"@0#XDU\_cD%~!Ye}~HW!_Yk [-Y9a;[x~34F~a깑oF@6ild# F@6ld# F@6ld# F@6ld# F@6ld# F@6ld# >. 
p Sa @inOȆl8l-tt\݌W>?W]2 3]|8PhsdAh`qr6)xxǃ8G(촴m4:6* i*}QRZW!$z?NZGG-*WhA⫓*!^t=2d:n=Tb{`ncESqW1ߠEm{zC[>xB7XhQ,bA Xł(D Q,bA Xł(D Q,bA Xł(D Q,bA Xł(D Q,bA Xł(D Q,bA Xł(b8Zo?~Ǹ䆯/N"i8:҆zFv486Lb4~?^QMOKz=בGzzD?U<֧D0>7ia4[Kv܍y?ɢ웥2AKkk[tfi׬bx>i-TX#voR֜C7]_}kq4?Xq`|oYw~R:OZ4b^`G3 :;˾@h)iFhl%S/8JA.㧂%jwcZ&U)-Yax5)/qq<fN}j{Dwݚ2nW)ob Bj{w-j6 GTNLrO` r{9GUcmI HRX+jhj.'Pخ[9UCB\(lfS>ۊwJPDd]gݭ9-vXvRm:[mCV/J2ShiPlEZz9d[*EVmlHi!2B]Uhkr"]2(FBUPᡳ=֜ppDf"v/Ed"nDCq'TN)(Yo'΅޺A'lt=R;Znmo/ vq:E.:d4c]Za׏&MpA"8 @}bh$bC;=~=}y0G\ 1g= )r42c\X\`adS ,%/o2EoBJ: R!#-ꅎ܀ztZ^-Ka K [Owϥb=~x\.ҫKz~\滷4E_ھ8ٓ lp1ңgbT`$4Fe: C-j*} я$'ޝgMВ F.U& ~lNKxE8ׂ[e  وJ<J*NQV(YJNF!D`xq%C"([PTtw]ߛ{ᔜbܟ&ףE#p!qO=ug'e.,C,pMK϶58C_O8O5aύVp ďv=vݚ{?8 k Nv|.6Oibٯ Q)C'}aˌ}^aр3Tdƛ@ e(%5a{ {~ZGKAw_hMBlS~ZP;E9ʣ2̃fx^2oqFe+!V?p/D۲)t7ҟJY_+mvίLk'[RZoZWJkz:[Vzr#g:GVe [ ,9'.^Bw"}@똞p#$HVU:j* S)I7T*CW؍`/J1p2YTEqXQB L@G U& ( 5D@׬ٻTZj/Bέt 9:{*ñrn 9BJewfVn9aJ|ʵ!kfI`f ؤx=7w?}wcRAT!I"W)[QV5UE:¡Aqk5 1LJǴʇq9J,Nb̖mO|  >ޟB d-mN?k鱓VIfOpɭߛY{)+g JՠqPoL"5"%"&RV)L(UhuLƧ2)Xr%zR3Mz'1`;|9䯇2S>`!Îmq/f%+(2g.d0*M-g.;/f`gmZ=pjTZ}'-7oɵ?{&a\|$vyGMP9[+ % $ hGq'IzfՌ:֚QEE*JI$4,jAe GXuBʬsz N䯁['&ήù)')(WD۸ ^7O U*"J+W'U`<@!7/۴m$,T;(IХr/Ft֊/Eza}Nt9a..a8/lKa]-%bt/%G*f4, BVjU:u5V}4n`.}$KepPZږGHL81ޢ %8Y]۝4xzׁZ[s>_k~%HlSRi5z̭jZڭ9j9"PKGp6&?JѭJGB@jɔ,wo{(ِ=yɝ@J1c\OeU~$10`M8U[ffwem~$\tcq6<] ڎ3xX^X,9U"YUUr1fAGYrAГJVt1[/K Ns j#c5q#c=R iƶX;,{ymaKvf'o<˻}hO4jxw8b+#X!3?E5 3C*BMDUtH!#30gEBR+Zب3v,&sHmjG g=b .ǂմc[VQ[u1ؕG\tI;J@uRAИ̕ QiRr /:Ȥ kbbA$LPD cHj *a5qaNJ 0 "VӏmQWFD!b+9@j[ ^eEN)Ҡ:0Z"v`k0щ(SUD34*!Hn\F "I%r-e2!'-@K9WA$EYc묦%⢭WVZ)K@T+oRTƖ" N@t hOTR'ucjڱy/=@ؚ"HNv7$O"E{rKp~`r4J$ӽt9<^ !1'(lM6mnCdy \r;\üQ?ڵ}{T| u9PL' tG|zX@5)H )TYNt!h2YY+s\J`QeR?ǸL/R>*цآOaYb7ve5!~>5On ^q/}+7wh#1kMyפW+dv@{uN]Rޤ~MZ`vvy}7)q%.Z$YOv| -98qM=zDO/Eω@'#N>,HLy>,fwDw[ls}w AT @3qeFξ"$M0F8LVÎt 5ѥ4mtYVhUGҪGzGa³# 2欃$Db,F `R+ )92 n,ս% KeD_ \eȴDa,pϭ1&dὦf2.]M2FkF+r^wչ W.$Χ䣣VN1[ĥ)>7&l3& 3Ha]mkzpr3;#噖g DIc r 2BXo)d)8YNp [xGW^#l4$KӀTu~+PՇ枵HLWWr67Hi߾RY+ct@YSM6V\I^H.Lst>ZՂ嘽ΪDߠu% $8KasZ rtdT90WقނYoKNdol<+γg;y/ 
dHc!ݸU/)>7m0"}ڭ/Gx"lS9;4f燣$8g!}0t"kNAqAbԞs,Y/la/Pty 00SPNi!?BZW) ke@Pcf] JY#Yd#,/w@2N):Av! Ev]TYM K;x3/SuDL>\l~x^{;%*-__<+}:|eqܪ(t9s3BBrG2$̈;<ÛEg)t(RESG 2(G8?ZdIV 8wSy ]w|`=Fzˤ¡I1'ɜce-<=*]۰"#D-]8@gլ~c[c}H-q(dhP/"E=_/DBVK5@.fTomcdJJNhGY(w|2[=cz#:Hwܺ ma-]]@wm4N_t6Y%3"eسr" fީ`χm3Xy4mg 4m^՛tFmf;MČ. rV|ьpLźS .Б"D mtfKiXl4mSGeMNԏ(xA(G~Ip|@Ž'Vባ#;o\^xn,.B.q].wqB7aCpBro)iw*Nm zEC>o~y5ت~+ٻ/-"/+Z.>'^òOC>oRD&{y"EH YUXv7yBק]O>)n߾ _ Lh"?-Tpf G[Чw;zy^{~h(ڽ|[2RV "R'CJƜ )!i8zRR|(:$qJuAvꞩTWl _ҋ˔-Um-*ת>Ҿ ȑ֤ 1DZ)TYb"Ι*h?{ײƎdE˙( f:z1$]$/I[2)H%J-UBU%'F.ԫѧVѧ# {Jۼb󈖓 ffs5ojո{v딗v旼}|cO6O=zA=ORqym=_g\T_)NzIu󽾮PaAO7nxIыޒXJ$ţ}Iq ?hUӪԪxԓvZAutGc- {dd%.f'Xg %' VWm7v}]ab EBTΙl1^!0$L(0V䚪cG/C΄ 5RAd HF2A`:T [Yks/ITٲbm:S[#7!o%WBhW둵9~DԗWK96z=z>tlƽNڪ>C]7' ot솪 efZ~r eM[gd*ߓ<3F3<׍hx]t̴p+8dߜ]eoxQ}êkP6s+qs͹yO:̣8wfVZ ߧ4EJaNwQ"i"b~=P<%TtZjⲰX')R*E ]}ZBz&4([ VEP9z'a2"4fͺs${aLt6=0xƠӏw<_ohd''9oo3zr_^)jJT0QDQZhN9ʡxQ y(жJcFu`0B݁V`?o+qWa1V:d;l7nq{IameP|smyP FRGd;L2B<1 *9zUNY>@ \9ޢ<JHw#o>e@ShAZ4̘@m֝5Lnư)飘KNNRyKϼ度^r]10.Acɪu(6E"lxv2ʎ&g(j&ejTTzrG%ɗe(XIͺs3*|a38ƾ p'q"㞕#i1 ]&{l$1, NT4#"CڷYBR)Z$̥%v_<_:N7!ZÇ&mw/n~|G+n)[b2~ұ˜t+EПDpxlPwKВP'{!h!xb!=2!xb_U[JfS=ѱ'BܯBT1'Lj.H.QXX#4#CѦ,sd-}ń)>xR펕{h Bȥ#jcIfA"!yH*M)YlRa(YjalHD)饐f6q[χNr~@X0ӭtJmkZGg[MS]=evť٭ږo3z+Q#㶩Q-옂G`k3c| ZM%(T&I%:N<,ՒZӞ䍛[xÊ+H~PbI+mn[5=ڸ.jP0v6*0!*(Ͱq8l^ơ^G Y2:PJd+rdԒ()WDT٢1?%&tve݉{}0vk%;yq ?M{i24 񌞹sr)Qi(1BC\J_t MA_3eTHbIyfN9tȃ$ZYw'q{35,Jcov| '|%f 8[fś1})OرEj!т EtZk쀝w"eAXTLV6N1Nibs%$_SmydDKj4s]ՂϫS=v旼|㩿ӯ?ǓqOql (M@ 3 )*(4` igr"I'߶1Rimb!fr OQ&)$OE!:U}Zu){nC[GRgd;7~>N_`{. 
+ALj?:2r!QFla E2KxE'MKZH@ϯ 6]enO=q6ν[zY/ڃT )Qo>Y(Dh)g僁, l:+׋O,F;>a8]la\)98Yw.iUd%J63w F9 igՖ'ap /X5&Zg\g@!R+N&Jqb1W*&"pTN FVVDCi SҪD1^k֝#ɑpO{溿O_|&7[[>c3uvy6D,%+j)܌QRWΨ4Gf.ZjJŲgeMrpSZ(0$ʠbB6&Bl:@ ٨~I.d/.Cz(jp7d#‰>&@7:NHXta,hF :,0hBĀ^Y 9 nGz6疱mf!G տ̟4P|w3SͯKL[NbnW\F^S/RotZc]G(?[Έ_ɍ.dt]UM>]n>#؄:M'~=xVU {yy}Obͳ_oj9ZEL u$3>ǜp)?܏'[Ѝ8*Jez]kXX˨Y4G= {43Cs#~{8s=Vf[x-zeß3lه5voý :=nW-uJ]'ڮ a5r-_diC[{ʉ>><;ϿZ}ߝ]"{Sܞ k.=;њuebΧ=|O1:Cܥ4nl Zuؼ hϡ+3ek=7p M[~Mevrtr`Tĝ WJpQA|?-K*Qїb<@}ߍn*uwְJ;MԷUSJ2I8Jq#VvVMK\ڟAnqsV\xmM]m'x&/yZy *lnޯzrgrӃ콻n@Mven/';)nds:9isjR'hg1/uP;_wّ@\ujx :z,`WYZOʴpj 6ztYVsM W[k W[I ӂ+=+~FpgWgWYZNpQ-<#s~6pr.p'WYJٲW ,:2q.p*K 7W8ĮF \s,-WY|p%Ĝ\e,WUVS,n՛+m+hopUebIЎʬx"}\By.ӿmRHqjK#: "+/_`ǟ~oזKNu ~bF\Xpɔ紧$&|iT˼'Hop^V|+qƍPz'vSzp=#aWY\Pڽ/)WYJڮzE2\yNpb)ⲳaWYZqYʅ\Iqח?:Es07; $Rp]ؐj3MS=,`6Ѳ;+}V*[;rE/v~m>P/rDH\kJ=!&ԣ_$с>ipTȬnmuD֥^ժ\^o:v擋B5 rt1$>!X~,_r݇ dl޿>β"j]EyOi1ʺ}-ή|{7j8,mLTOL$ RǓm4$e-ra{pLDc! hK .Tb"+P%Z3+a`EK^R᠋{,UҪzG§ޙ5SiKTbCJ%"BYS*BAŽ}arRM* (H'iaH̺QzӴM_ :[%K))'S!9',&cF!u[%B<u":e|*Nq8T |L>Zpˆ(f8Wr<2a]x$*i K0` +H@-lyLC#XrQh$8TFHWR-bO2S˄.XEh[X.'n+E_.i!L*XLԐgIj]Mґ~#Um?%6})t IEeȼCpG6/$@Ajet)IIHxGVsiq8{ڃt+|ڻOM娃֥9.8S4Q4gp$ߝO%uVKIʡ{nD-".?<C d R+P>?0$J)5CZ{c6%I†FQ)qC 4䄧4g3),CsT9* Y@o:1q68k`5w.W u}L_?on*O&[zjQ?NߎQ7F3Rp`)Y͔`qws$"Mhh <`4-y^呁ˠcUҢ_iy,F2m 9ֵdn}/Ճ-=##[!f!EN%&Ŝ:׈e }Hrs  Z}cV?_X?%1Xݔ4A2zNHQӞ]=/ 6 !KMx6[y,Ưw-0.sSu#RsGA:|EGbLR~>.g^o{{9\vi02Y?(TC/aa|R]Nj!;z1'Sk`8'f:vOmoxm?uG79 KOFNٽ*UltÛZq&ģ2X u=b랼*Gl2;n:{l`ad.cuy.%vܸskiRx=V=/m`&Ƶ UtW~6L><9bA7y ;=Cl %6zD zGh_;ߌ!Ywo3B.ܧY}yfBed^r_lRk+;#N>_úc5?ck*&fO:O=z)-5 CRgA+jv/8ކ2x8.>QgA)~ϨPN(:)}rlmVgk<-QP]^ iΊ. : ;W$Y(ךY bAfA^2r'AD ιC}AcC*M3 )EA/N1OH)Ie$bUٗFŠzIM]"P#3HR gpy`NEr4_P]ͣRUU}QXՋ’t9L :J&Dۓ1G N BĺrD#;Sͽ56b? 
I_L67B`go'M:@N?{qB_9y/n4i:A?4OK1E*$U)wHQ'RJ" Ywff3;LMur5KݼeOlW r< F5^/G+|<ijq"Ҏ_[|y{/^:5鬩^OnB5"mSM~4`FE+ JQ]۹ܞ#š4= $Bdj2mTKP7;sgAmR$$H;M #(w[ %QMnpR9& {ai{!i2T۝ӴQ;1ֻ쭵,,]`5IMm%:͂]3fO5COw͇^N1ɶ.DDIZS`)SW]^ Oٗw^,]7]5wL,u5W҄׮w-7(X0yλ!wkovu}rkaϣes:^AMWa:֜=M63oψ4[_0]hh߂֝>,Xe|˾Je:Ti<fh#ί,PIM!Ho`z}ޣ0)hv[ON=8)T$R RƂZ]uM uZެ-d~?{,$Ao4&G(0#r@;bZIq@e!ޠ8;NQ9˩gUTt;t2w-D+VOb66AᇣqSwR3BQ>o99~Ǝ^%\\Cx1}ډ`N!sQ܉\?~j'jȇv:rqLQ8:1ĞYh9s1v1 mAݯ?MD袭#^MB.ۊ6Qex"ϗY~ë0g1KkrSJU୫$euo"1"\_ktkmDֿ%-kBڠh. "uh6DkҨT(u\,׫EC)ӢJ ưa'~5V3#\С(IjN22n9Hj.n94}*?Ǔ\cKFڏ >e FsnOp3hUɤ@PlJ<.C6l`;8jK%8 A1TBێhq1tKYpL}Ÿc[- kmiY7:[ABT/@&$ K+"aD,KB*@dHhϣB]"ED: ?.$PXFma}X6gY1F,jDUX#N#vN ь dhʵp 0yB1J'D "BhJTh:(S<(`)eDӒG=Z %F kblP,vdz$}um)u;x'. &u 1H3"dVRp+TӋE㎭~NHI}8TXJ q Yek?QPDsM!7)5EC=eq>qH|9oj*^ }{iew[lYtXX}_vA%2WsCIMd "tJT@A!T"2hW `۠d[[<{,.ԃEw] z/qOcǽqk߽݇緽<0~LVQ_H7d)ޱ^ٝ[E:'ILyO[׫e=# kCݝDF憒#}Jh>'%xreEDS2V Emh)NYWůR4Rq\T\þTJ}8ap@*,L:u5lA4,_]e:uɕp(*SɾLbN]}1JJ$; us =uUCQWڇ vWWJ "Օzϯէyޥ3[Ś|vv2S.^>s898MD`7iie}SyNf_~=8|nXpÒV܇\sͷ~_?_۟EM߽:9 z Y'GN895Ă Z9t. v&MDqL1>ki8 a (_<Pj}{1oxF@H N`#qI Uh:AԉA2Nj{BX茁4T#SN(v? 
|fV%rFyt#QϢPAx[rdg?Sztk5Ldܛ|eBf Nyȟ^ů84VD|Ĉ@+þh%`ygf5"-naŝeiVd@1XqUm~t [s-ZMZ^Ov]X w>޽Wx4wA;B|6Y9@,~Zl'AH1 H' d ::{ hbLRvxTQJ<.O˓d<.O˓d:d<.O}HPTJTFPI* V6 QE&hZμ謬>lGj=k"ꥣ;i3Wo([-S7mEp@.4CEmHp;m<8v(Cd(N!tblhm3eJ( xx5mSd֣)wIa|ZFW·ܤIQƼhG1(UBјdȽp# $x?|j\P3@; ddJ!EF3/':i 6 ъR- B\d Pp-WkBƭؑţNX_B$qzVDTK(2|0&< hhBB:(ܠ|Oښ6]d:A*+H8%"b'B9eǎ W?sWиZbК #>NzÏu'Ur݅S_}4~L֊m½,wy^C&,F-=^s{v3⟵_͕7W/*1a5^Y[nvQ6jRW~yf 6֙@>E4uX41QyN>?z|=䎇M䭳2rlYoRYǹCHX m8ƨR/{ğc6 n\/4Zwqa9|soO']׺me&^}k4]4㞟]k9?rbb4E1^r";*r_Wx#>:=Zƀ][ rv@2p*!C)e{_l!S +[Bq]zWF+ޕQʨweԻ217EG*¹}?%F8~ȤI✻ B1Ʀgո}:Z?@Z߫{ }}([?ڻ~ Zl;DHKXMѐ4A rE5]KLPNBeʣqoE4 @ z\V}GY?0ղvi "D_!T~!9FM@_)yڤ (PxKh͖'nVN$m Q>-6))'S!9',&cF!uѣM.e 빱1$G*& nQ3⩣Fk6#"̧$ضR|`$ naq{%W[3I+SnْwS)vb*1cJ*+YL[Bͼwa[sZhS=%j H?۰XCBQ|t(&%lh)Y#n՜|mئU`s)J3L"9*P6@xf3;yK#u oP -Q8"q牖4$Oyf1qvkpD>.1˛Æׅ|*'C'扵ILWߊ|^~XGoO*~@T9 `Rj'm~Sf~vV Hm\s۸tե'2m18e'U60)yH9&;$Qcrɣ:rMxzМqSX_-:"9 FVG#M!LDbh.InqR%Ol\L3x%zx2Һ$.v~u``WtYPZ>kt?23~G䠀Q)o\kUTT4(I ^HxW;cT;3(o7Sh/y"2m \@Ajet)IǰEH8tǠVsy)x6oقޛ}eK;z*>)Іtc:!)zIpURFrh[ 8f|GA7;`tLA aGgq@Y)f8+M BD(0 *%.}b"A434 r&prr&e8U s,PK,&.V#~zuo9l~Sdvjɳ}DxjoaP5WdF3Rp`)Y͔`qwr$"Mhh^#zC1 CXKM9ל2z $4PMf}YKşͤ3d]kqdhP{Җ]m翇R m PM;q'\kC,ؠbxE΀bQ1"b^CJZ!"6 F:a掂t 9epi&)DKKꏄUbu3h%Nh5ǢPIK1,J1I]oQj!{1'Skh<ΑCzvO`|kOr^o4!Koڿ^}tFy]?ߴ/o™9 ci$@׍DGrǛ\b'ľnܕ}yg97$\Ǻyu3־&ca/`mŸ0hBP8q;/kCSm߃:i`} H>[=.UCѨO??Z#_7cJm^Â~H!t?k/H&_F~ź~CmnW<#6/.9,sBH"y_I7N[/e]#52E0zk#ͤk$e,GIOC(.榶n=<ë?{RdxrhuX3S^) i4kaC鏶N/r&|}CWں# lįM=VŦBk#ɛwm d8NN@MNgKh״')'[`hY[lV˕7͆-}.9H $hmxd_ ɆԶڮ0?Mi"ΰMm=f,6ju:_7h}moXw/f%ѝPg{Z\o^xv9AE<Æ"7AFs>;Kōى/Mu^PE>g_gHZxz؟rA!!XOYR)FIƌ)qDEN ϕy>T8y^ƒy]>Hq]Cs/+gU♡\Lv%x-:EG~hgǟbk_MjOhElw^ꗞ#b4CvNK(hu+vt2i콛׉ ?5 JJASB8yQK& G\yV/K<>6>f?A& . ڪ6]NmxicaמPUcCvOkF?7޸ߙM|? zOSq7yD\HmSǵnGoUut7yu<#.Oz-]ftݮ»w=>]n=:Lȍ5Ytf"Vfb?2||~8vy|qy]v}S)=/ܬWGo~G i/M϶-@9g.:Ts>y핧ί>ó;7/]5STmС.~`uTHujmj9m rĨbRHEH2De!MWI@~%8oO44H-ב3 ̈}(8Vhf1 vxJ4kp79ZV&w,Wl[:lWNA#+LvP;7Ђi5K7-`u:TN)UBǪ*$B#uh2Xռz, V6j iIe`7QQ*Sw$XNY̷n.ݼφ>N ˲Kf4Yc.qT4#7xQd 8F! Jjt>$&yщ(ktT HG¿hp*Wt-fE/+4IJDl>Q):ᒐzE. 
.R@+kB$^ȴHB'(-َQ?)x.XL?EDUU"nP ͘0N&F)ڨ\ GcA+.P=!TtB *"e*FSBBQF$AKg\p^CMKhI3P* >'N)!!mWS7,%)7i|b¹A8Qt:iuoQSJxЖ2q1p\v#!%𱿽VhEl#z7~ZR^Y`S+{9PZCȹRR֕KyIPWY`#/D.L;\e)WrW/^S˷Umj/)Wr\ , \eq\ \ei9wRjwW,c$vy1p޵6#bif,|5ЇFi=e >meIT >Tʖl$۴-y([夘LFx *jzHi`>N\Rɘ"bPZŏ\)shn)+؊1WEܵm#7WEZ)\)4WRK +X2WE\u2dHk\ze0W\)*{Ey4׹xܼ)/*.^L/{ZsޗzU1~x$:XO@9Ff0|?8gݧgH9n/U?\?ig6g_sKrmn}s/Lwl3gwl]<]y\/U|l p|8:H ԣ&@9GZ2)쓖*sT[Q%!b)9mRsf7 j`Q -3Z#845>PpEiJB w|q=[Ť=febb,.[>ܼv$9ҔBt&Y^1C<ήxxGUFz72$]m?.~QZY+bt(2@%ս6 WC9pN; w;K"ebN~8tvѯWYOx{/O}\珺Wժ[w q{?U'URO۵Z%sM\QcGke3N-WJl20N0J5peDP3 ۇ%]{}#IvB$r栂Qxd{\"kT+go@gb>^s<]98cqjU M)ߓZ\L2JH&p̉i͑.)Pr2WgoY t={ wNVS]Lr;X&|#f)đIRpEp]3Z*J#L@M-2P=-Þ|1nsUy=) rNXࣱ))s, 4pQQE QJtL!N{x_LmɁoi?M?}t65 \ە%Wڼ6ZA/Ϳ&Wͤ'qw lv;p 1+$tk -[ڲn?\lFH oyHDJ |6@Zu$ H0;,78'еKij颬 2VUـᨆǛ᨞tTo (sB P`yBJ"-cOPPTCrH#38iҨ1tU\5qH,v 9^-a< "p_UKmn{o)/^% vy1b"י[xpZr˅/\1dIa@ l0#|:og_|z}֌.WUЬmT|vuq%|4u7ϥ[ڕxSgWn@eɆW,Obz|Rd] E.krv1ANjE<g;i.F/s^=_zպvD<]ǻ9xO k#& R 2nf6)ѣ)K7;%MP_tϠ/ڄnsܛӛ7ȊjmloZ/rOG@695-0#7Coim'dq@\a}4m}Ş`V~Ex+I6vĹgĹ_pc?ӽ G w&[̡o.ͷfcFײ 2xUi]\_wHε{ffOm^IW`gF"PE  Z޷{:60z, M&i(4zъw7lxܩF5v1]t#Z9|sH&ᨬDȜrFeMd_ªe?ƝiReӼ92 *U8cG⧧&[pY!d.. !8}M%g";-*YDWf`I/gZ^mjscc[ z֦;|y1q۪GJX!*˓Iz[7 T qc4'0 QK*; ӫLIyAOѳ<^.RHq^@H-8[+\s͵uf[wH;cG_aIS7l_FG0^l%_@PDb 5(rAqAAZ ^̼EP@37JIk郲!BkLSc:ZhwiVjO85~m OELD>8eb 0ۈs1 MBQAW9 uCo$Kf4m&wn~Vb7Aly#~kb5ӊ[,3#=@ -Wё9 \,[dj>X'_ƴ#B-hiM BVPEp%mPsFLE4`)Ch"ǡV g;(՟MzlN^.~.бGrl\?b_ݰcVԓ˜sY1  K *HF [E\ҌRxttpC霡tΑΡ M yi2eL8= Cv+f2O}G]u+PRR2@fS18-Hgՙ!d.{g+XMs.m_̿`c.r^~υhzk=[wpO\2֮ YevO*|x٢ӏ&C7;E[)ґ+@hCG1_\Edٺba! Rpخwkx|׸F絖|:n0!utrshSbojtw۝ XBqw{={:nF\ f^XnR\>+w13rq>qLQ bT {$mFrm E7VkI:eRRi*ef(c)cJK^r!k7죁$º#oY)PtV H)y*GdrrgS6 +j@!I~|q)Gtɯo 0%2RVg. 
9R|&!Yyfp$9̪sxImUԚ5".ԁw(*8e]I⢭SzQyH*ʑcH*( RN꘭ѥhrSTL ǘHնoajx-mlY%Z|)ȸggvy@[ 7Qx[q?fAX)9+"S+d"RY+P0"C!{.r*) :#f6!;!e8-N<]mvH,LVcx06q>,N)c[1XuÌ;FA VØ8r#Sʤ0KmG5j 4ڃ(#ZC Q\GJSir4K^yp-7΁'̈́扦) 3bcp*\uA)xvMƴd]^ xŇ4>R:2T?UJ ,L̇HėW{Y|zX!}AcAwP>r|>y?]0X|7,:C_, .O8ZCvvf(-2E8 V;b-8|}Ge~i |P5>)([6CN<\l0+y>,/gWw7WJUpÔnu},?*~}\1Q7D;k`&ٷQh0.^|GF (ʁ"xY+ zhct=͡Mˣhjj%擵w<=qS,a\*)۩Dc/Sq2Lys4}ouBЌ՝ڒм&MbԊdR7oS+*<%:?gQ "Rw_GIA$Q=5}_DGGŏb?\&auԕ~WPiAjڛ5͍ҴhҮm+v%7uYA!l Uq᷸W|q5t5}lhN<A*^d8yw?'6~E~6 J̈1\:mًQ2>y8 =۩Kh&ujdbw 4#):p#PiD2O 49SёT6EHf7@=J kJ3\>g[+kVsbۄjd]WMDWkoE~u6`o|+j[̈́2?rXX׭<4QJk0[c'ԛ$NH9e]e%)fX+Wmܳ7: V;fa^1 ɨd堿Z9))f-yiTviyf[OhBsz-?Z)b29rL+%IMMyNKHϤ`cBZȋgCŽe%7> E؅ +{jУcg:2\ Ѝ`n,@̆rc7 J kU@tЕ5Zh ]ZmUCЕ(z0%-윮j|U=bACvDWPj.5Jwt詒l.ILC-AϽ~ @| T89Á_8cerOe`DA~֑ihz1cld*DiUnԱEg;\oX`NyQP;׳iFJ(y<ѯzlnJ ,"<ϼ_th6a)=Sڐ߿9>$朧HmE$Bo"YF$ۄdʹe=y,q*SLm,l$em'U;ʹ.MAł,Z-H\t$Xsq*ky%"X$SAT 64O\ߖ Fʆb#ZKbKHbͬ4 BCWWP jvbEս`5OZpaSB: k$t(ig]#]IB Bӕ$ ]!ZjNWw>ҕ"J])UDBWV[vv!E+Md4 B9תP *zB e "`3p m+DimGW{HWJB,B&g ZDDWMPZݵuU.q"C=\솮\Hl]te:ZWނP{=O j :Bev;zbe]A<B:8^ewt$tADWp ]!\uh k;]JKHGW{HWiBY@t e̶ЕdqH ~ :ZK[ "JYWHW3JBp@ &BV]#]ḾϽ |^'c˫s, VI{?3 0"j L|:<, `Hj.aV%:fPև%7&>2+8wl J(6CKv"[,/щ^\ &-(AY @ڊE .*Z7ZUXIMXX7 \B!Nя?GP,uSˍaq;L&gq1c@G[^&Wʤe)*fʹYL-86 \-?ԫBqTh?_DbN(NG0qa0/o4+⍛ozkǪ74#jaR nxy _&#ű.}Yel2,5u[0Eiw_6./`ecTޜ4*jǕ,qP y|2}ϗ(1!nng]ʔ+CAnk^^ ifK5Pޔ/ xQ LR&hq&ck#:qB@ [h֍b{/w-ت7#ɺm}Ž;RCÉp eъ֧!J$Nk%RsoUhcbЅX~G{pq-) Y>t_H\J& e:Rbn)@G5{Ŏ@ nZ+H#řuG9wz2*n-Edo!9g LwqߝCB[E塭۵ɧNm68'PMM)9gҦfíV?wUyW<qۊ!Հ]wCPdjkqlY[х+Xuœb K率Ke& \)pD   ^vӃ͢5-Gsi:ΥT17VG~1~lnJ ,!3/%up<3~x)[m8` W6,W&BsqV7X8׉s1r=l:)oW\̈́\)jm[ $ F;] u]zhwu~=msf^˸tp) o]ڜ%]!] 
K: l9 U:(E*Gvt$t%~2ѕF>Aeަ *uhdm+!-Sut%gDWK ]!\C+@+H+DXGW{HWJ)T668W`AD[o]!ʶIIJ+<]!`u5!LBW +S+x8tp m}"ԴQ$(jG"P;դt(%DOIvV=Y=&&ʅ6%|uDO-,$-+P 02]1]#]qk6zSp3pҡU!=+I: Dm+KI0孧+D)IGW{HW*T@t%q劚heADiGW{HWRam@tR5"d=+C< c  ]p[ -o ]jRÞt:mJWXjG;Z] oG+D2T՗`8/ĈX O1E2JSIH"aJc,%4]OۄQ.œN{jM\xeIDwRB\zCO;)Xp쁿b1aO2/nWF|;_}7Kz@)FMe27Z -uYSMwvP(;CwK~4.v[w(;t;Kyb.6cuDpS˳ \ei=tRj7W1gJ*.RmX_&߉ DY?QզʉZ@Q!3̘{=m(zq|qaгe2~H}9]_s*v/(.S@Gu5Skrjq9$G M. Z^b I SW4"4$>$c x4S3^:p̓ĴGpJ8/w/99x˜"R)m@] &g d1Zw޾imLOVr&րy5Xll෍D h;;EJf&*Dei5;(ryd3 D %=sTX4ÞB峿ETCHxTTh`YaRyf^jUHh;7ioҰ?]vܞYBE DCS4#zQd hDJh5K:/VC@1^:*Q`Q-C"8N1.@1+}͆Z8wj|?#-Euv1U4 9RfOO⧬!. P>+L;-^^_5] uPFm^Ԏ:9&YNݑS ʙjqo?p]|wjk8bYK$U$46(pVJMavUЂRBr?It 'zP!ǂAV R6!HՍs72Uj4c_,5chAp.Q"35*0[hsUǽr}O|uWؒрcyΏ;cOD >9^LJ M *ލ'fΞ pg6T^3A%6'J׽.qFl7%PP6eͨ-nm *[I";n TrM(2fOTzpIHeӂz )yT5!/YdK$QxD@c5am܍S?)x(X~싈fDT "6@6@4c8$:r-e@Pm BVD4"2 $( X8ZGOVQe5#bmlqbYLbiɾhjEb4>q1sNpT}'AZ[T)%z^9 kӎPR';VU^x^wFbBNQT X׬~|ՏpI؃0Yzqv4V/M@<SVC!ϳB*S8L s*f5}NTZڞ&0OwZjli?k!pl̴@O[gEuX%(at\chRxh6xʠ_ZJ#uj㚯2ٯo:*wQ [cLʏ8&Xvuil ٛcͿ5.e^q+%ߢ|{JѢ-w\Wxcrwaw^&},䳝g׈C-qKTq*w%3mCdDT1ot8\4z ^6zi8\sCsÁ@5i%±BkW$Bdj5NR-Q;b,fGK;Z3Ca}^qU9L9fz: ou}مE+Cg銡Dhs9lߧk L]?.&7g/tfs=QnRnև`!ҢVk4;267O}8pٱK5g՟%ݜOzǜAygsS$3Oh 3ʶ^zkJP(([$DSb̋|ǢD7"Cˇx_c}pZfZ/` BLJo%;i3WH5ܨ3pG3B3E2F iYm\D"F24CBݫjg\&0׽y\:v8KbבܐElvz'kd.pgG 7yy!'##%Q h1{)<7G5k R,|I:{=|a2"S %4SUR oTED%Ȼh<vJPq~>;5cjCA0QPTz h,9hlA- /1$F@H  \p^2= >8 ~THtѣ !kHsbBloXg׋#.kC[ P Szs9#"5X}QD$i5Mc`9Q6rh}in\fwSB x/oV kp&so6 PrC~$-xzWr xގ=SL~kIԬ䍣T/f ].w*:TF6:RUqM楖y"#l$,P}~|DJ?mLf:Vfj X#N-Z^|z?vK_6U1_` <e%V=6'-?6VRTUI;h$yE|/|3>?{3 2>]+K.! Y\6EC66:ܟn Yן B4r<6i>u6xorLAJ- L3fgdn7ٺ&p "< y%F5^-; 俥]v7}˞E9"cd9sQG$)Yb/DX@ĽH܋Mfk%FEBZJUYk&o\i4pSHݷ^fkZ!2Y|V\]N!T6ceF>,0[71+SY3ѣjCvLˉ'["7{O_=Q+an;=vZe[yc>p9赟= PfFQo|m-pA+9reISlt+vZ7T5wT7&X|s/8R[*K λ[ڈ9fB;}e)vLMōׂ 1I%NԑF&5Nͧ0wZ*DkRr*764.D5g*erS쫎L>lV8B-Hɉ7Z+7ٱ"lgGt\-g<۞xnǐg97Z{;<=ӀH[x4nx%zw_0HT d Pإd`[q*;osuRF=N?ٿb9 qi785_nBHo7dMrɿq 5uלQ]ff_w۫yW~E*^ݩ}!.Ԟ5'Vc^fDy^x°י8(O˚nz?/<4:7s7taFnveoVа{b75C.XUA2l|@q$FڞvPOD?,_fߝh0COcn]J0F;q_#vQ\Jv`W. 
Fls;{Ik̦tjVqQcu!kUݴ7>Z[-8z6; Z짥@.Fesu,a/tBYw \ĬF! &03c{&u_tc@\{jl+UtloY{[<(|MҔ",ҽp{D.=Ɍ؍  =L\7ǘ;rZʝ!r w၌HfQ߲+9"mF)]YG\{@=ZPu" #;)FqPafc.bHRÍWp)v=9xRFGz# Q3Bx`ʈ*'OvqP}*THqQsEWT\V)36;QR"XZY{zz*Q]0\vѩVBE36'C@Yb3Bxh19ͬݒCZ#w.ͺZ4463ܐNEA- vVb> 5ԾH&8 c*4K66K3i[EŬq,MB`ːnMmJwbE#7togTQ%_V D *۵`w!;TʙX[7 N7` ԃX+J=HISց xf J.ŊTX1.9Y` Jܢ0BMȘT& wK)(}n;@X,i8nE@:*{*x!ͣ0(O hv=!#ȗ,(tE^@(z&Q^=:!2Їh]Ƶ.":0{K*5ft6ZK!8,X]dP?QiyUs& m$9FI;L_d`4cy'cS/4&X5o,GEd Ǡ]Xjנms@ ѨT%Pb7`^H0.Aʀv a%|KuRQj`GFZ$5@G3P"O+ZnT `;V^/LEH)"0iEj,@2šjBAo 08\c ä0,* "tȳTWh! c:A#kg L{ctb84EEFgFi321xƪ m_ 2;tY#,yd<7!h_B]6>|A~Z6ieTa0-&[T0u1wol e刴<0+߯ (mwWZKt* Ee0\ƴʏ FbnŒ܋ɬXwäDl9t@C\^@72fhp_{hNJt9*c[BAu0XWR qG3':'X7 2G5Q0D_1h"A;n];fZX%r;y)q[0pr' ZDAZWQdr=L ?SנP-0 0mT51l7llaV֑xv=ܴ^)i3uq`2SW* rmkz}dsנRnR6c!2yqn5!P=6J Fx=X@ ƬI(2/̀~ je1㹐Qp4T^6|K%ZU,'7{qŎTDƢݒcS @l[qQzTΞ́l?yuvQw,otZUu"MCDS?tg߻CII#\F'huOߢNH#'raW- /F]R31ίHg3r1wXk+(WK.ܗA,8Ř9T,(}Q{SO9Jxɕ}yG$?L^?v ehk|C9?}ț_ۻQ ڗ7tnݻ۟ o >{/+f{lݷ;d/\Is]?MskmfW; 2)n+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1+1zFWV9]!`d]p]h}~FW# ]䤐_@s}+8kA~9zM.48ػR2y!R;ӓ[OXA<EE/LQJE\2jk,PE\-jWZ"qE\-jWZ"qE\-jWZ"qE\-jWZ"qE\-jWZ"qE\-jWZoW\mt"oFSX'5iťםG6r2aXEg_Njj{-*=[)3TOoz'|X&lx>vX8O7W}hS1{PhU)l),>S=9^ sGG ܪ{;;E;{n$?U/]+7/-c@[h xr\̥r#6_}-=W씣;"=_|nRQ%~U߉9>DR3ɿ$k \ƿF{eJ{ܐEOg\=]ْc*  8؇Vdy.K.1y .XhWJUgzmѷXv 1 hOm8hgJg}%/\Pz?38[ bKu*,פ;x @>ÛVG/}9]Nwf*A4d2nִ,)NO~)-..K->*gqsJM8\nN{޳NXF>tΫ~ C_#'7XU~7)_nՖ&_7ƭ}7^o>f|gQY;xυrThDL)k<2-t^] ]Iysy D1XNf Ж` f),bJ0̠3( J0̠3( J0̠3( J0̠3( J0̠3( J0̠3( J0̠3( J0̠3( J0̠3( J0̠3( J0̠3( J0x3g0s}ִ^NԴ"2_߿)~иlǮ%a4 {_p>4Cl4s}弢~xc `LMJD+4\c08cV.lj{z8HΈM^${ 0OօHF`&uɩjXNߛgŸ:cfF|tqMa+H⏽~!V|f5潵nݭфD+ֱJ6ch^3<㰼 8t=xgrγ~<5aHX{;`ךMT4>&!^S4^O+SjI'Elii)}T?%FDv4TxӢbHh]Wrʊ #t$P]#E_THu;}ovkj`5BSCzXw%O21c.QʸkEˮ>v͚AOKn%G:rQeh=4f!?.[yijnDD tW*Ow$ {!廑=   0SPR:vh(ZwJ)àWpciQb $08W멪kg2>\81 ąin3_ǂGSMy0@M4q@D-#1 ,Pc')̑|pg' clgOAcx`rd-2'M5pĻZ3/1LzLtچD%,qst;Two6EO:}YoO-/b–Duze7 ;|iJt? 
m;?%DY9X]`,yNf0.7˻ϴ K1/TFYJ~\]Lq(wss:A2_%Z׋ouիC\ JU4Ms8#4yFTH:}:LŽF_;7gg~N|6=xs]."p9ci0lmXK"gErqi4#Z?aaL,oc;E+8+x4XŘ􆇓| v󨋇4g5rIɓN )+"ǧ]لWn'fom BF2XP+j٢}@ڥr2 iXewّ|?IKJkd\FF0t U9Б`ͣe.G?*# C5m!Vxj'@Y'Q`;G ZFy7 UO5Q&p7(<4KMօȝ8Q3]%ȠO{Isܺiju+2[u+tغ"k4Q:%Qz/ 3٣ @2xDL^HȮ^Ӫ,W&gu$a*8t6{2299^Ψ#aý9tn8x<͛H-=3jpS vEoY>ei?LExc;㕒hy`3]Rg`yh{ߢ$ܙmæOvWcwT !g0*nVdP$EKO*`Y:ɁepFxڑe?'"S$* A+QD $kMBEg65M9^JLACtZya#O`9fdyQӦOg'( Qil{ӎf2iڟn4>YfRi6Τ2~|M7aJ' /Gi [޷뚽LwcS "d<xguv^ojx0؝M7^Wi:5|ב+鋻\|Q+oMX D/qY.bP,k~9JS\sH}@it:O% |N$0C:r5tLu%K~ ∀)=c>67.y6 r8KOɿ-gLۋtE {zy$2W]vgzGf*Y|wK B2x2 *&- e I'異$ }"qy} %N چK#^-VӉ.Nj3ؗh H*؅2.jT#F_G5>öj!QiW#7y;z;מU\p=^Ō OIQR MfJ^e!$zQ] _{ FE,YxBBk $TL&39hw^tÀS BIV/ff:|xd)"7[U!1gyg_;)yVL:)"K13aYI(W!#CΖ3W̐9mTc;c;ތFh^~ՖY!k2=dnE_DU -WkNT.#I,XR-y]&h4qטL'BY!H` y6j4U]Bv #2R=k%Tv@l|*ߨ(/2I$\"8gRDɼ q&=|^K]biBTQpd=Vv8糃|*%N(t`{Qs1~DouJ94B m'BNx DV 3dJ$2UΖ|{]*J@tzW|F,Mbj 3I`:e,5)M'5Y{_ rAy{+q][6X I,d&ER^-Qe=ԪtIL:Ka<'@gkb /S^gtZ+}t1ҥŴFحjhU .#mt@#$ȑzeYJ MC 2hPdkg=J94ҋ U#2TfE6mt)hUɳ&؊آTi":> hb ٣{,X'w;Q]CU5:A&E7,)AoD%P%d5Ey@Q5S(s1u(iiUͯh*s Yד{a^==wg{C'WsQNs8<[g-0[^RW^W=z"xQL kTxfD Ŷ丷Q8z2@J 狲*CiKQ\gV+gr]zKo7dž伇U Ժ1=^WqK.!^n,/qZbgZ0.n#h r҆~Cj~FUVjV=Ca.Ok`>6ݼau%6B8mx9gGKЎ\bZ˘(!`Vy@:Q :zt^9Ʒ52yGde۫ɫEj_Sێoml6 ~c+Č4rTd5SƝ$"Mhh.,y~noAܑ̕y|G U`([~s>Nw 8hg/|0bw_e,٠qila_.5}5z*?' Y4*н%rǟ6FS3l]"Nj!/ZreҾ4&b㮊Qw.-IWBUW' F#!]&5I)k{qxms5Iioʲ49]?L-(9~(λSK}Wuhw.T7ΧCXv LMhUU޲ERzwXl\szlxۢu[Ma&O1@mhoo*2a֘K]7UdUzdﶼh/lx=VYQs#Jd'5Դ#9bƳ(_H1MF  R XN/nPiT))"Yg!T蚬Uw<6`=eI'fэ-S.[GEN ϕe@/KOY.&h\wn:>GqM ³ǪWiCQ>ϓzuB' SYʎp\⬂X\JXFVLk^%rֱ$GM$ Z<螕w4gEp#\(ZeCBuҨR5*@ nLiY7& .mx~yn6'WL&S>Ez26)~Z~ A1 J+WJR gшГcH< &Al$ȧ$=^:p̓ĴGpJ8/1ȹRR!!P#p\ 1q6y# n U/$%]EMdjw7q 'Or0D)^xm@Y &\@ C2eoo$9Uۛ&XFQj绷`Vbע^>o{ ژ7"(.5Gtts9WZ8A2|"hai) w#S9L9̩{Emכ(h^Cy%(at\`RxFLqM(xzz>eJH{Ozl؎]_?eZ{~.Ia[v{={!$pM_/)i=^I/hFE+w g$zu^gI?6H9%1PWL RYxtйhĽV -)zb1J&ijb 䂷 )K$EFImrpk Y(Xs A š툒pX29V#Z$R(#.΂:: VA~ yӜo%\-=OCtrz0.nE= *{xͮ\/Ԙ0 rfCE8N T "l$8>"! 
]kQxm;ɌӠ67(P\Aw+POHnnr(y•f%|/F3o)B{hf hpԖ{ ^Paowlu^_0̼i!v{pBgOt4BO9ܧ'<ܿ_\gwɰ|>m0q| wx.em6 z_9~wjۆA @yvyM&[[>-.G@-k˳|Gmz=㢹]wq}寿>+f\=O^ ]-+`8V*,WWNO O5U6Zj* <m=ӟŝKvA8R,. Lӆ{O) '*&B*"D BNU,ϭ՝5=-f`g*5 xu.9Bmi'ܑS MҌ;޶Tx;euo`1G.^3=,v]_Fwk.Nl%bBapsa(TWөtuP֡rJ urQN(2>Ay8Zߕ/~%>&nT 4,˭D0Ŧ}S#@6`!l\/'`p3nruWPĴY65U@X-C x1< RRr@d94(Ny?Lr?89!NBekͬ=j9U|H,YfJ|ڥ,wI✻ !5(6 44%pv=piaĎIJ~}x/ GsONA-ltgV~f_#Q},kjY~}:`{r Rj[G{xJ5.Hz 5$Y˵pAb"&ygLDTSb2OMFѮe|7[n{ڍS$#/:}=zk!KqCiUOvҪ 9yԃ$񨎭T/DeS0F 'dT* HmR DxU xKHhO}3 xHIyDp$C03Q Ũߤ#wsHH4RMK1c\$y+`$SiGR3[\~r~=];)>ޞ4lóE.ޞ闞>U]'0X|FSenV~mMdIq_+jYi[;#&*fBώ:;wlowYy3Oq̻\5-Fww 7#cJ][A'[+˄4) j`Fưq<4\>rRT$B &N$ŐV{)E72]pwJp]h_tۇJ۷4F6WUn{n'>_Li&e/-D+gHD2%1&RA.;4Eҿa/*(|J@J q: QPP`1dRoH$$5sEΥC 1)G@(Hdx(w{Y#&! E3(w#3U,8(* I'Ǥ9q?V$[*g9ۜVuG5;S\uϣܓZ-O|E|6^ ǻ YHWg=+O6U3;ǯ}wve/X; ҷumJF^PY;R8$ώ$h}&9cV2`Oխj}:AT"/NCfU^Nj B&X[}Y:,g[;1. q?B)Փ>c:TWz>^\ ~U|{1a [g|!ͧ ;bk@WqxVȎER&4nv8!LhCtfc:.QMO~=E܎HAZf( 1sD,٦Yi~׏ߺZWvҒ5-y? gjP =`fWB;EbM,C5rCӼ5[tiMrMn-'fZOP{Z{Boⶇ1tU*XogZ!^"PV)@ h3q0HxN&bI3^:^W`9ccR#Q1DN]rXB4ȳ9D%ttlloеGޯZ%ݯC[yjnŰ_BUʀ*q,d] vQEiH(КCWv6ON Dٗ7X&HBswWZE08J(`̥$JBPa9-x Fb)G"Z3~iA0p*ͩ_ڥXwنSE/{x@etzfMƮ*G+ʓ⃈^R,=eZpw(`yF2hn1ͭ|5<( \<g'p$*Ou#ѥD%Rxp+Ɗ%r8yBQ#s8$’`gI F A5'zV;;YW_I}vu2;&xPyБLj %"h`5'Ӑ(:@Ix- ˿Y*M` 80ZuT'B-2⫥&A- PCXƭ A,FZy)p<O,h`Aʭ`?.j|͡(_1WDa[;_Q0Fs!(lKE,uZ@$B\(IjPrqj?_v}CAϛ)AUް,53Bf!E$c2v@8_m6\hӽE|A$Gs4M1bssGD'u +I^YA2$! 
-D JJ9VR jp/-`Z& E Z1@#L34LrrewjGs֗N[;;w `ŻA$OwF6,=»u^^c-1|xӮ Yŋsj!jw檎 켢%qS;kV89l>.CWǿ 8hf~o/鷕?fX#*-fgxyznL瓑_-""<t4,.QZ"6ds//ې>znh>UghCtgY_o!b[:zi ȍIOG!wi!n4{1fancotngs$krre^UV`]O'cQi8tMe4x@/]w!CCnz6n>M ۜm6{)Rb.JYnl,t=4WesO==?i֝>q:ylQG=ov,6mv{ҵlͱ;|.8 PW~J|OeX&즼]4+1"S鉽Кw U"SLpO‰D8PB,1 w^RV\,7!_Pm[ovf5$)K:}&MѺ1eeiiVѐakSbuD>я0_I!Jzc3s4~b95K!\@!Ԝ ;- ukuJsbN˺}r))\Zύ#)NlDRBHψGDa'&e58KuJBXV[B<Ay;,@aC5g_ӿ(M gN!H9ZD#AF42QX!G3CJ"ݞm˅h4(-HMr fd} y^LS+Y4 wL(#Dh`H+59oFB^2!_ꢪ.i;o qW Lqzv>ۣGmÊ0I{g>9?Oz>W)@X BֵՈXgDcDE =,Š=bBXKyªFT6h7;פQCҨuQi],>B[mJ&Pcprа8C[S3(OHrXXwv$[M++dYtiؗ/oLĘWtFrH"<(y*dRIŋ*jD㕣 5 %6%B sbU@P$2&yuhyAe,&'rx)|63%)o(=`ߛo{=ӬEٻ6$Ugp/^Nn Nf a[Ywy)rdXgU0RuJ 2ik8bYK$U$46(Pi+;;,H!WRBrhaQ 8e(zbN*THƱ`c)U1H*TM(^2UzqƶP, A>zyj,#l}r̮ ;Ť{J{q1x1Lr- h?& 3i-U{'gЪII)~ pg#Ylr'!(fJhm:N>&P_䬗n4 K&;ڲg-R`;N0oVyu>g5ċp:#@0C}RKB*BFdHhϣBY"ED2 /XXFm7rʨowJ`l+UQ$A"ދ W\ dhʵp 0yB1J'D "B^%єtQ&IEyPR3.8!ʈ%{(nϜ%bolp:{mY.\<{Kq1Z N:: RJVqo%rhKEWrq_aoܱ<'){<|&Wsґk_6YQT&H5hIPLNFIfKĈɃ0Yx/z)Bg+#T't™21 s*C4.)J}18av8)~dw*}t>|=g:U>o66JP!Ф&2` E tJT@A!T"2hW `Zf 6V!yy<._PIyQ@:,\֕Q'XmJwoeWWF[/FVݧug8T>;d@9xTYeMnfؖ2lPD>AoCKFO ZB2QfMW^;`؏-@DV%ؠ"XǴV@9Z^1eRZǣÛE.&rLhDOl$štDIp,=29@QZ$R(#.΂ڹ16VHkQog-jA<Iz&`X&ط *{dqVNwdz T8N T"$Iip. u;Q"qI|W4l x_,(&s!c7j+~.~ ;Jbq=gpn͢d{UȟAtДmTTDo3 [Ǵ+}Qe {*:O*Pn6촶j\uBZtO{f}ԃh@eỜwϡ%Wz!6|u- "%m2¥;ԕW65+׸܆S/qVO=Mݬghީعvl_{y>jo^z~~FjF7g5/$YuvcͶfQũ6-~ٿ~73OZa'ikW^q||cw\kgMB1%[μ=WJJmIߣzVtH3OuxdrbVzM.VAw/gQ(>" PN-7*W0B$>i/B31RHp;m<8v(Cd(Tg.(CS0$K9~KӽI2+[*]J)U d4. 
~nom LwL*#3T:Wn+&:/@R\cybx22RUBјdȽp#;酪_u0ϟyKmU᳉ːDRAi4 IhE-.(+@\ht/{z ,3ExJ` E"z`Aĥ *(`'asMRpQɽ!Zɔgڔ 1㈒~Cy1L}}ϳNsI=w;ejh>TQ@vʠ"vtEΔrd'x.&SG”ͽ-IqzB1jr5tNn9<;\L>|t6!PŕI3J8O*Itrr<:^TD# H3p7z lkm}>86C]qOPP}j.dz{ ?Vn;Φͅ.1291`.gr% ٴ }qn@HWK:[ˮfXg3f-.JF-f{4nzhsp99Vն]vUqMN?k$7,}}~l+|Lh1=+_jXK굜+~}v^&>'nU^Ye&"_)pFʆ_߼tv_W%r/uvWHޞ~ߞϯN)e_^}-?LQ$Ii'܁;Mߋ6oH_Mc{m\&MCn|vM]hjC!1v@hE lhAvZ>V߾P *n|)ٴ2\YRi%RtQIET] ,lvKܳiDmhDɤB4 TШ Q"&14в_9.o؀ hH_$с>i4X Qu]Q/r+)ֻ.B nä?sڻew~GzS.Z9e֦R.nx@GO)'yhq׍B%g.rQAR'o| >2E!_B32ڗ-iC8EX w HԈTqOVѭOY,/ٷmѯdra3d}cs+Kq6RďE=5}B?z9PDh(/T:H阌qs_wn޹-iCxl4N%IT%l'0ĢE3MqTrd7z0ǃ@$SB#r-Z#@$z&bzg=uK~uizHo*پG\H(T䤰J!%$Ck@zeQ(8%Ir"Jؠ}<,^~E_~lou E }04F?"]DC̦h3ZZoF}O\Х:#uT T<Ӝ cWyvڈ&oڭ\?rZ eyw3[8-?N>Sg(lKd}Igqh !ABNSNWv8|/$47V} P`u`ȻpY|3+xgP)bJv+>ЛRe=twQ+Ώ\U&;C{36cpTxENWTFyokW(-(? Fy+ϟIcx|JDLPJi:ևe:w&we 0}q>IP&sJFƆT2bM3juǝ.-:dłzpFV'4J@uجK(PD-Dc=fv@V p~4y ᮀF³g??H8nog /h|G'`G4'q܉\Eg_܍ZЮ֝Qq:m$!"6KV g-ߕ+IҨx!;^E\.ʘ{wJћgkAZ|Xis~ὯTJgkk_hAsTmlf.4;s 1JM({8 Y˽VtL\:_\x*9,f Մ,j3īs^hD 02.wmq̔J:Xz{e`z:۠e͙׮np%ݝeCWʓ5 Ma? +3Zu*ϝlGc(@Aua;hxh片Ԕk&s5D;,#ܟw뜉N[Ι)Ryt$'.R%R$-Z.14[)_㓪(G ݢN7>pnqW$`⧖\%K=!&LҍF3VKp j * #@0Z,Wڲ}2YTG4SZ9pI`BY󁳼Hoɇak`<'ťd`oTA҇*&!BX`m$=Sq&M=%5Q,H{G"Բřk1UֽTթυmb Beȵ*9AC^k 9N1 ;E:Tr_MRZ4j9* LZF )]p c{I LFc0ÍUU'72&Qk@t+צpbfkg0ឈf8ח %i2-}E ҆V6Sä$2 xBaC| 8Sہ2ZW5hщR[x8x{ >„{z!hϼz`>?YH&3g]H\$-m;pd#<"Kq1*Lb#Cno*qJʈR(j͑XA)IgxU Uc'g#L1ލ红w7s <+^8: @ȗ}zPK-c;bOQb)IX>TUf J} #7m; YfBk,tiZ%D:2D{(}#(.Xh>cV+d1U60<9]` k 8uT|T,G;3iG $%k<Ī6֕x+mk.QTcmʤ7)3fе h%Vџ1Ǧ_6yY`0X^c(JƬA5̹ڢ(P \rºd (lh 0BDJ}c[O[TuL@r抩˔ lhBi>8 4vөv˥tr\D/@B]^ꦈ0g Jٴ7! 
cѶjb4VH,@:|UTUR: xDw.U[c t#fVmi.ӓ-0&lJ80)ЙVNecX"oƭ 2I_~N .;f %E5n`5FY,w{#(`~Eo A!NU!B]Rl-s{SQvJ"K&Q$F^,.z֞jEQ_U0VrBDbLc`#"@S[2,pZxZEЄ@<=a>֭bD\|1ʣ8I`RTRXpD9cy}2L fouXNәU[|;nHD\{KDBKvZ}j1pLqΠ&j+lyԡ(ЃcF>@iԴPyE(}R(T)V~cmLm( AJt+Sx[Q -ʤ"Yԏ/X{Wrx*ƝfE <-`9%^VB0NF7Vx`4n5d*,hb⮆XM'PG]r؍s@ QMހwK }Y]T'e5&a,C>j4]ƪڜףbFP4<kFV-5* ΃;(@K"mR82Ggm]ƢΨI`&Ybm4 ؟m2w1`DhJc<9ye>F/q %dgZA.w =, f,yՠ6H*gUnKo"{ oT[ 3$]$yxt*2 cmaHu+m0Zki FiW~YJ~lri$!7Kd0umMW-d0L&Sas`*4,E2 պ[Cm58EM*5ks8n ś !u>fTs\@7^NDU.*CXfԘ 9$bUtD 0IJK1:Fff9MpIDaŒct\N&J#:KckT[]Xd3;^m#W:ދg֙ :o bK]nژS7o~geY|{B񖺪x*5.gQ{aԮ?xY哩ׁrE%pw@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)>!%nɛijZo/ۛWovR|~fWMs@~j;Ag>zd9~bbf:Mr5/&)K,'a2-?O\Oڥ_\uHr/ןw%?EKFR91o5fX}{J%=Ԃ~2>UTv~"GI닓,Z;Pk#Ta}Tji~0E?ggelnTM@G.~7G鶸ԖU89c fX  W wU4j{L'3M!l~ĩP>mؗ^6a<ň,SeftmG9y(ܸm9.$bCb˪ru@(^qeA$}6흪=|\Cśy;;:/ 'Sge^YyvPKV68LO*o}5Y ˖ݯBOzXӋwan{?>=qC Z+:}i\{Ӿ>9ooiL6#n}~I徛|-'y3GO}(Wq-8?]i#l_^ 7O~ͷ_Nο ,Jn~W}Q:~柿^ŷ%\qt͗vpވڧ Uuɫ-7U.7=_Жɗ a5I<}צ&5=riuLNɬN ^CVsЬXØk+͟f=1mZrx1i=wlgn*6QlRx.[CorԆ;vCm SC6 ^}y䩋p٦WsXI66+#W(=F\qS)bJa+hbg.ctqJ˪t Zhi$7|۾ʎevI.Zz%V(2m#\-?fky|D[wۚ-%/?󽒇xُd gNk+-./ n3v/n!zA[^2SBC^Of5'Gă渶#jtbYl35cLlΐ֡nMlYV>V[["l f!@!O[Ü6u7n|Z[@`#v'iz2Avy&ē8zq)aXkGa[f; ԃ<%ir /[E{ w:KYG4A bփf=;a8z}]oR|ΘPJAh#+4a#&zhϘMb  Sڝ+{Gr ?/k5W2-l6H")y-MajbڐeId$$ƄTLM3)`[VRoOiڴtudۓWr૟O&_^.ҏ_"j(N=|}_ܾm#Mר|]gU09dQ |\wR9er;z'ۼ򈇇n\'͏jO'N]4cAqqoO|%"3^͡h|{r^/@z3+[l 3}vaΗOa8;g.~k/petv:[=gy*0|}Gv4V 6_K*fΗjڗU8mKe/fW*bi7]{o9*Nn-9v`vN0Y4jَ0-Yղ,ӑtXv7Eɪb1:Tl%o4oVWVYdxڣ%͒HiA$ óy,$_JߍNa|3mͿ_FMRi_]&T2t\ o zx ՓS?WxuYFåUgm5sT+Fj}pyՙEɖX:nf~dpkfC5􂨝s<큋VGQ30b\ͬކW_V)\wm[ U?V&:ezw߿ka}\wj#NUCktj CȂ vms7ke-M.t2a^!X&HSE57V"F 3ꄷsvMsLl̮wltӛ0G1a$ow W6.e6g2li[xxW ˒nSԦ,dY*%IW[˶A`Ybhق^;.2|e`8 iQ(JW]x,qS]Uf<.ټX`D\>~~RTNYz3bƳwguognIr<sukgtzH vUVؖ=70`2|hKB+ yWS'@Q~$H=:%v*! 
KzArX+.W[N7c_ě5脢D*pw^ײ01]mf>j؅΋"^̠fTԲ_ͰQԩ0ftY|w#|0`a؆aOߖqE<,m*pdGS* U(Pn v"L(BO?7U]u}Fl-> H hPR仫[ g"$ ϵ79Q T#>-_& k0*Ζ⯶Çda~5[ ^X?.%sd9(I#%FR8べaIXn0Pcv&4tHֶ owٳv;v+AOwVH5su1 U+2G˜J!+ QXʅ 36YQH.eB fQcf0ԜR5,3FΞΓoPRځ}:Ao|t5 (Jf(V+'N O|`T8(9Zp)80X UR01D !03V:d v*P `QQ4`Y˙eFHw=²4~N:a}kԾ6zl(-ϲ 7a⌑+,waD-b[笠T:= Mr[oF{9xIqvXM"8P7UZXU+"e]lw]x#Le4}e Wi iq)KYv\'gͰUѲso/۷  yQc?N-G")'Xrx iLX 1̄2A$װN0}6k(R1qQ~vo0Nu\ռ7֢_K*~r H%!6y%"M*cp[n P:T D^#FH|jٽb TtJk&g!" T)5u`G\"7>bg2DOLS&A &$nȝb΂wDðq$bFYخ;v6EV{Tj%vW"x[ ?2Š }޸1X $aH48)X,b4HN46˅b](S!E<O3/}z:UOy c@p"D[\ CudXR Q!#Yglyh׍%8U#18^#51\u6XSnP8 s0i40cg <9º-:dv_8$0QE-}@(DfK9x']7)±V9ͩ(`p!cG>_̫#69\O9FT9BiW$)CL& 8%;Leh[|Ak-̞<Puq1_s7~kdeL.n&ЄYyp)`8uUжܦL$* uvZ,;POŏ4_0AHt[X#0i*;6ŵ+,5nFm`^ߧ z?|ne}H\] WjΦH,xw$7j@ ,)hOi~ݓ x[^ l՟Mh &8]ސޣ!в .V&vk_wU^ʷczzQ^!15dJΔ=y;c*'1 UERu9]]<>l >;}fd2-POh+0~Mw{)jBYu~RWpώMs4 `+КeERڡظz2Lj 2In?Q o[c/U&Hhha UkƷc)w\lW'!$ߞPc G߀9hhD3ae̹hMv#9CMA`}'Rm/},WV?RNm/¬@"e6^6FsF\P t* 1  s x6sVN ZCòZ+ Q bbZ&3DV&Ίϡ^Obc67 nxn 3?g܉ӌ09DV"*F!rt1I-H4Y@HO{tɻBo_]Uod<߾7m v(F{mo^8U..k|Zk?j˳xL:+ǥM֦&c>IM-'\BOlǕҖ\,9I ^i /#ǒI'aIȲ%˫U GDB-rp8Zph'9qw;tUZ6Vgo[M >|N|XmE RӤdd,وE<d3Ymaxn:Qq5̀"*j~V K@0B*i-pT"lWnceL%} K*Qf椊98bU:7VVfa(d'd(tٔD AC2%1Ďs̢jb8]GTvjue{#ح{>M h|dgQɏ"%%:h'd#k] 6>8C&dȁXth$dQ 6jd>MVkI~Y XjqE4--bobծds5RҪH#c:gl&Ġ 2Urƴ!Q.ke2 r3!h5hc$&-Ho8D"T#UujcuVӒClmo{5=T$:XLۢߪH9{#<&B4]| x,vd+jlwծ;_6|}FF=G?LdjdL'#\Z˯V!#(לB4eA\TpM,0"}4cyɿPCyQzp!N:tbm]KO](g1JSit ;N %HHs:g d1 Y99u؇&˿^&{2OZѤi@xu5/_ǓtWd/s=_*n ;tE۽&=CQf'eCrq܀ǍJ_KUqe5lOPw Oxe8N+`l9FQӍG_Vࣉ[<Ѳ~Љs YhI>3$慙LAb!YFkPVeןhyCLyHL 2(z[%GgPXoE wAsQh HEX7U2W m7-ly2Ңi|v:Umbr`-&I%GDɒ%5./d#Ē3*`P.f0>MXAFǔ;i:̉(oS*WH-^MN 7OOc,!)V2.Kxk 33$K2ߗdO}|U|?~sհ\2Ϳen?$'dvqj]΋gNKN%2P2B1.832/|AFGY][o5CY9}d GFIȌŵ|Yފ>р qwEp(snFO0 `Z{4֐FFwO׏wzu8Ք+aXI r\?n /k3֊Rx JFVw}lTY^|]at1a.8g˱jl9HH?^.ڲx3#jAgKo颫ٌ&bQ%5*Wd<UËbKH:[l{NvW#-۴:/Gp=GJÊY,ʃ >'|0Yj[?>Vʷ/?~7NΗ1\\W°,EEi_Mf-(߸z]Jb[W5i_]q8=I۟7߽w߿yDž}^{wcv)}yR?B;MoXu>MC^|v].wZǮ/RkB,aÀ@ߛ}&xO'M7ΓrF@ Ni&K-D0!ys;ƥigcr:1GÕ !l$1KQZYD8E-f]Se_縂wjėsqrӛݩ# ݑ'q'Ҽ].7 
%F?.H[{23.Ud2ƮҀ7N*˯'Z4gcRӝ%_P4h(Jb6Ky~P{}crDJ8/aS[4(;@e=oε#I眤b7 ,2ŗY[E:)Oĸ0$!)>t/^}Zں&Әoݝ 5Φm}K&c[;?mDy^|w_.V/VOmW7aѵ^7 J^aԷ'hqcFow/,9]55-5C tI$A6 ?fw nҵ&3P9pCTQ=죂w_FU[}c㲶7lC8d#U} e"M3ffduv.uLncO?sd͎FȮ_eF+{+'#1R<ųnr Y?^t#7ӏ3n'i{=;r>J+#zL!eD2\3gq竽ϾZo  h/ :4VxӪsǧ?^xNtW844bAm|415=]y#½p*]Cŷ/#7N'%ӆƷWkV2qu`{ ;Mw:::ݚ}GԴ!o,]>%Mį~9h 0Z}?7~x*6zmx7:ܗq%l,;Hݶںd)e e)nb^ \Yyw\薉y4a^1Þ}s{揙)KC073ln;F9Wgu9L$ ItMVV8-rY;Y.$Ld:t{x֮6p~W[|??Z\I!Fګ=6^䘬VLv$kJYm v?)5f/х2EB;rsUjEߋRy?Gf:[ZiwwحBJqsZ[KtfBWAdlSk0QӤB3j]^e9Eg&E6:r9yb I5K[[jVn›m0YZl&&MjB1'Zj3:ɌaJ9 o춸ѹk3SD h g6Ѩ{} Չ&:f%Uz!lBM;JJ1T03&cw!0t׍N{'l~%B!Id0?„{z] (y`~N_ǩMT*V{ڵG^%cNօM,i6B!b|jn붟 iVE8USI{k%hH%tG,F[t!(sRB#L11KwSt-J I25mRhі8:BO ҋHSFVh4vCDg͋Et,CD ɞ5)0sS҈5V ꤱ҅Y]!(dGoOM 6deGގ0H%GxH v_U95>#<ņӤi!:38o=npy^KQEn"棊1!D("_ .ìr [M>o:tS2:CҕjI#JU%H|tLECNZƁY _8gP4D&,9d^U0P>xmBV58 e8my4/!!dU@d% e@ UfI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI/R3/ (s s8:I X,@_" d`I &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI/dk=˽wY~^|{4O6ƆMc`/k )7eh|8{}-k=I.7zqrW!xf;Mrht*8tERŷhgv7/}jjoN;SOgcMcVM Hʚ " } -Ɣ, 'm5! Sns{pzy^\PfT} 8!$D7GOxzdlO\.+Ϲ'BsӭC"bo:Ǿyn/r^aOFVK)ydq dQѸ4*Jzvj޸辸c\c1'ڵUߩaZ)Q^k,kOk+^$e빎=J`uGIR)HҬ)=c=Ó0wOZ^:|oZo8e}`ݧÉ(yu^E1#Վ)G}-u[둫ǿiL?ok'&SѬ޽ù~ fGu~iqu gpRGɏs_._-o^]?w~lϭC?ޡ&'>[?:Xa=X?O||<~&><_V8~ٵ7@ڨATZі?Uu5ز swiM4 鬎1x=1[@͵ӲS難%^WGpHp~~<:$ϑ}8>u\merrq\\>^"hU[.9RM*=lNc%\ޡ_O˓ĴeoGK&=nWW.|LV ||Ko]|ݪ So\~rm7o۞ٻ6$Ww_ ,r%9,aTWw[L$!);aU_D[[g]3SU<5Uoa~ L۞8Kzqv:Im^/E>}hfh_^4!f9l ^.toUh*2?vt9R5Q+hs(TajzqC%/]h濌pReK<ԼQٛ "Zp;0ܷqHkt6 'gs:%0JӲ1 NERLSmurJ,z@wAʺ6G0x-"}3ɐϙr6U:ӫt/eq"`@Op{ʧSYEùfJ" c҃)ތlAR쾎?? Yz9XޖEyp~8 ҩٌ\,؇AEkZFh$&p`Ŕl[gl@d-$hގO~#Y3~3j|- dC&a=pain[|mi!۷"jm+fL31n>^x8 hRck'T.j<[ݣn?boȣ&GK%w}粎z;oBm6&ӝ Z2ώm4 c9|:rv8{t;Gal3:Yj|wGq~}{[cta2^/]A.aCթ[vK4מb@4m]nU=Ecuɰ4 ZZzq+;_pjqVz;rMX{q׻pٞ7hۮG1΀;8f=QM{QU+ҶY/F%i[S297"/}D\8Ph|cq;=J,DY9:-lN c.ֱ)_vIXM5!@Kؔ6ݞaۜe񖞻>wu[Y" "QhL&&*Ve?(6Z(mNf<COX=cܒG8䑓dOX-q$-/%LHy>;eVMOؑYfۨRI[qzSKxx'=10P9ˍ2"x>iZmX%.N3y9+uQ\IX1 "rV`S4H\+j{jJ5_XM3N/Խ/|T_8-ȸ#28xd ׳8l{l#x$Cp\zaXhJ7ت[:݆B &CR)EʆVh36 > \D*{ja<\v58k^^G4jʨd! 
H$ u!nkDzЖ 7 1sc6081dB\ekbbhD GQu$cu8aKWoJc<XM?NGG=^ iW }0)%*\hA9+562ƝeB  SUgXlo}6hτЩdQK"1i6H=b5q{ķWu 8[UݜS}ez}V)KW[)gp$86J&o~1jq?t=> XIJ%ُ){ُ^w:1yT0|ֳ̖ˮ Y!{Icԍmn\F)bwg>^ny&:DP)ʡl0z4I#,dAQ$(@EouHA{.9C1ƉEl 90DZ3 FX.,d$Jdi>A_7:wƶB]ϝIm0r\2Zk 2N&) D41&c#E >'.60Xgi~;9JTM'΍={ [%^]']ODv?N6L%'+-J, sXUM"o{),L/u, żJ]Q& \c0(tX0%m+eadN>^/bM7T ~N7~}XF[ZtXK/} dc2{8ssM-OD$ҫ $١-4[ dibi~$*ޠq*;4zg9d XUpHi)zT9eȊXH|fGRd"4cu9-PثqtnG1D2#, @R'ΑUĻ'*dX 7tJ@M0!Y,M hJ%Ut으x<~ѳ97#U.kkұRșV6R&֤>O };#%|/`9-Jr__ENC .!;_B\aXģ8 7l!TJTCK8\x|e_pW/?s'S"|AH7<55ͷZ:uԪU/o11_<^ q!N6E_;̑ăxzympL\rF@ R!4H0!y8KZot9gõ;cB%FlȤ$MYXDp,{h$ylTys^;tO9Q9qwmXէ1<ː? G\fGEn @pNR9;dCmfGl-jn4՚uԮchդ>nƣ HҏYۦDMUlJ{m/zx{,xQCjder!j)Daɥ$%Ynݘ, GV}ycԧ~&Vtf)(:V tz_j5c>/teG198/ A$ML(in¥ PZz)^iժL I48$%lޘP32NC9l;qg`x^}>{eY>^[8%wrˎ+rd3SJgo8B$3,hd>*|x[uI9[^hs{`Kd2xgxZY>Ȳ8Cfh8-o9qRcJ1MB2ZD!d-q*[DEn>0 ۗ̓gI $fjKke9&@60NXLu[q\dfPv ]yoɕ* cTǫ cd&#`vbuJ\Q$MI٤Hh)RafxU:-@oR(J_:7~)5^UUQ0h?^UΧY+(BͿz7?0,~σv_zM//H^,^~}?G_F~/f(x܎㦎G_9.XJyʻ;@Q^c+Y;Ăol@&YNNc=+NWr-~^u):N?gEu^=?pe&CQ:"q_Լ_XEo*scv|dh2e:\d x"Sa>5 X.@#F*J]D':(H$TL je㣌 fPZN> w:#o8'u C;uN/Nk#/$fFRΔqMܸT,h-~Ӯḁmz-{.o":{= =V޲x P6$41 t`.gCK)Is(7U!.ArI!jI9%2 o%-Q@Erz縊5l5A|OP 2 9 ".ɰl*h!!Gyo:ϝDŽԎ8파 uR$2͸ԅAr&X9Ӊ МV/Fûb4Oj˨5hkz*^_,d^ -D9%jIC%X"KM5-1ᵤMC+pf@WvSAPirnvX.I s)&@ nN=PNyI9ED`9Tm46`d+K/5zTf\*wWzIȢ9[bМ8R';L8HQ\ tMώPͥh3뵰k-#UD鉤9Ht)QnC?\2vJY,W3iEBTrL*0/TgIh :UN9g gK>;.j*>>3E%măc^HcDA@iy!hC(`U.ʜ!Hڱ@,pT[y,&0eA^M@R?EUP !"n-5A$ NwP5'Nobޒ=|5K:M4t'`X(IP&sJFƆT2bMTqkV%{լJε^U\&ŠV 9-~YKH_}ܬf~Nkrr[m=¼ߨ:輨_\(Wwhuss|E?2"+o|Qqueex!%qk)]U꛶~7?ëؕ-Yד{ao^=ིZ<;~nk6`6ݷ;.(ڥ<7m iq {՘9"ɐFxrsebI(s{fVDt= еREY-LUV]3y㨊76R=!Tސ|DU. HmR Dxe <%PL]^./g<J"D`J8NH RJ`jg%Xo׻ Aznl $ r% UxZĜS,ĖZmm8wˢ.$ n&T9C''t 4t|$)rxm'j22Eޖp$*i K0%`K+*;0ÙpLC#XrQh$8L&jI-Ox!ę(bTǚd !gi.P c%q牖4$O-ֆEʩ\~rr>x/<. 
Y (l U誃U5j>^LA}|g;I( ?i/!"an|$ R(sŖy|U+Ctgl?=THɬw^.#whm}6V,<+q֮wkM)c.]LK2}%&gDD؋Dȴ|h;")J\> ;Fhg!Wp1 KA@$aCRʉH!y 7E dNΤ Q娀)x q BHY;1O"i9Ж[o>L=y~{K/C̩HEV\;䮎)IPcG?~+ҏ'( ~IpѦ^$HHyqқ^#?<ז<q8 .a{>Ȣqo8cfYU8N6Kqr>f9JPn:M]JKjbu(th_nA);=!?N&֟fb&аtո8daݠS~TP^;U[-'٪?f5%|+ǫ]"ŋ!-L^ iWXxSGmxSW&mbM SbSR~bɥ쒹}zjp)_۪q֟Ô UѼ)tKtxӳ&/W8,4 ? ?r~.~R}&<4n)/P:;aaa*u{Bcj.[FԘ  ;\n`4]6`}4lX j ^wmvK5%]]dSo޼ic>'54g߮x;-y*RLp)DqP|v7ScOm|UJĥ<j隬g xl03` zʒJO6Rݤ)J7fLVwJxd4;fnGe <ƽj4.7^qԦ-ƒUp^C)HK⣃%Ӛ`+u$ vң`] -\f+>/)g3k?(6DKp8kq IJ9kUJTLin7z~t fc[ra;` {+<|uKrrۯ+5hy:Ŗ9F@<4A>%X.)%ҁc${;u]8*>" ^AWfٞgpjƼPrU~ lU9_ y< gd aIa+&d8dp; 88c |_MQjOƮo6 ~PCA-f5B.wogAD?i1_b<-T]m@ضV s?u_Cvq_{U-1yY9d|"+P%Z3+"JF&&P])B*j4 <0i.|gL"%ܕ`, &M#T; [8BlN)ǡyiX5?xQŴɻ9V {OoTD/; ? F;1]n6ZXtQ&*N&$7DfQ2!KÙ_Og}Hśsx臈LMi| TzNy,of쉊߷W^]*1Sin)z~1Jm.[ʽ@t(t6^&Hz[߅/x.&:1Nqrc/px|ޕqd׿BSI۵ 8dFcTD9{\D6G`х HzK[usTyږnh_yJ n:[[1%OV{d6>vwz{ŗmzJl:9kYW%q)h=_ FdRtpד~?)ݪJИB*7ew!xT>t\l VOw ^k 2ֆsPU,QdKpFD] 0MQr9C1~ZLm)Y7)o3R,\mXSZ@ avI()wFJIE[G<Մnxh1Z#M1ZrV>gfJd͍[@@TR40vFWZ%cM!:F2*T̐?bB1hd缤21{WoT#FSH~~݆gYW(#.DO,ڧ\!"^T,o PJw?C4'!T {_7MМUU oZFSfm5ljx"7.(]cp0cwFf=I5b'-ؓ9f8:AϘkc6!nTY@-_XKEU3A!6F5K!RXoݸTEcOkj_VIC2j(UÆVܜ-mJ"7Gkt:Zi>͡>ўutV@N0Q9_X> h쒱ёO|:͚R(- n4:!x<4ˈ#:CIٰjD QG*!V/}"XLypl(K"gȺj5`ą%dCwdunhJ}G]=V V,*ߝZ4VnztF)!P܅6н[ r FPԺd,bX؁yB݂U#\ "U*VN Ȥʆ4j}'I5EB=QAQi+ t`V(ӫl˱d4EcѮmO H+Y;F̀j@o]?w cÏĶo]D 1kb^c#~4!wTLc*u &np`RQCK@T:\A:MC{Ꙭ@ ,N) J8 9rt x!BUgT@NRP6Zo@!Nc̮|A+| 좱"1UQ@IEQug~Bet0+=W#|AE8P֬il,@'@Hļ=(aL! ~∽}% xfA}뗳u[kf .USW%4' nj1DUJ=5jBB%Q}2yaW)wD ^a}zd~pvԙw5͈ U c{(ui@.0W%sJٷEAYe Ìc(Ohv=!,j,(tA\xO (mxX]*H&zZ#t^SeR,zGpKR֭A^bF%(!L_1MNtk+O<o 1t`uaU$ 8Ձ5;::`V/$ ƚ`1t'x g]D]eb:\9)6@GxIYh2y~tC.-Ax* 7DR&.RaF @ @L)}X1z. 
mM!g0Z.Ya a>qVH 26ӉB?Pj^!]uf11MV( `f53ÁڠRk4[oMuԽ"Z&B[v@B}T-h5.u.Y9nm0e_ +Wvӗ0^L6-Яjgf?,* 36{jfѳq1eYfuuLk昴5g5rFl n0 |&c/ʷ^fMj:䰮T9P.Ϡ ڛ(Ht&%\!mEAu0 )dj*EϠhHyx@oݮdq5ա*4O6 4uSjCxr7p/Y>"o Q\1L*8FJ'd$U#1r PuMcQ-0 `*6p `N\76xk+2(ZFj|ҁc Rת$_63T;U Tk>⒃^Yo5kPqab2`jo9A-4.!P6J fxX@Ҫ ֚w#CY&@-\1 z}!5?-(hA/r|dNҫrOY F(0[XZ0끆6󛉷 LCecUc6\E*ҘD$b9R (h2#)!6/.>jk5#[`4jM3]O|w"B@EPJ+eH0@5z~6ί-7dP@!kRjS.LkS͍g*.FA ߧ'[eUCe"Wl>MuPӞ=TtTd0J t"_V JP 3H@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D *dH ɄQs&[_@)'JQ dPQH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(>_%%!)`pP1oa&sT9%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QU4z:%㥵1sT!QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(D $J QH@%(>#%G뵲}>Ə͇}㮽sGG@ϪYrtZ9m3gG)G7>hijni㴝P1ث׽<?鷆S=1;J廷oƳrΩ(9{FJE78]S4T4 K`㳼/ɨNRn&*;jx=rr4Ii|\ypCj*'18!g#O`Ϫi>{(Bxdmc=#V<ǽ򏢴u]T]|#R6ntwWw/;y׋Q›u8?[rsS qorn)Tf?p|'o_˻?yz8~RT^~LNyNާ;[}M:޽Klis L y\e ^Kѿ%{ylsH6*&!jA"ŃYSdzϾdbgs:#p/7`¡=f݁p͝qYၕ;J9D r(t?ZpҘGb ֠ o2~owW}l|c2_#^v[3ӷnlpyl⅗xg{s47'5ySRș v[l{6, / R݇},O6I`$,SB4I)vU5)6XT1[$|UK-YX~\ )y Hcw^_Z"ܕ5T#B-T\hʄ:TC !=AA=泐eKK,Q_?Q=$Ň=P"\eꉎTf`1Qxfy*? 'p$2[|10h@UiNlxA#TO|92ŸԜa@4,Uk2?ؠ G@* QLqcf"Z<'I0(>Ȓf8xD9]ք V(8kTB`$iB$SS2f>K,D(oNޢ aspxz\n8wj`/;q˥1tp 1zXLJgpvf( sE: ހQ(l18Ƙ$<~\+ ʏW*_g vpOqgX-/`.oqp_ )>Z>9h_ gd1/qbWKҤ5P=Ad~Oä {zKj-/0LØSJ5Ū>fi_&UP qFD Z>h O?M:ɿ̖;^ٯlx͞s9̛\'%t|#9TqgQZ&iW,‚6â>(lWRrꀗƼ]Vܜ##˺7$Zue^\pQEBxū#XNv;(۷+MN5vwWHכ`$ljȃO.~l';uy1e Ҽ3pb>҂_G7/)lUC*zMvp3=_NŹz>SӹiuӘT®YN=߼X5pԿƜfWB8n[ޏ4`lu'޹ !%Ta"9`JBg$e1BBjT" 2 Ra lFh؉/1ش3HH")c'2a G4RYTJ-Y؇cEA+,UGSL7惝q#nyIl/e x(?UYӎ`7nc8?]o8`CpcFs 2hPLr%Q>h</J g{BxQl05zC“p mkv)mw'6RL1 #dBfQ²,MIF4(J )W<`?om#E4!mJ!W|(VᴛPCR3W$"q,2DD,$hL㒧*BQPH/QL7Cvjب]*ViVigUd7Utc+T\Nҙzb/0䄗'PWVgu"ewg]8KkE 1?ךo/^Fc"!C'akVPbP]bcY΄;Dw*]MF&ZS8Z8BPk$B!,B YQ(2$V N$$l Jߗfjl,A>rxA\s,+`*Jf<-pk˚p3_F7e  DLA<ԍbsN T]! 
Bt$^TKC2B Id])IBQ9+̩ZFJܾ TS%/;ïN O/n ,QjɚC~7LV2W}4ɘ::1}I*Σrl2kfC h>؋u8&LWhŁG9^BYLS3NeTT@m\=ZqC;fxte€^\o> O袛p²6&\NEątѳr%5$ېEU^-,S`0pfjo"ՙhے9^;; /fn3qX.SĔt(Id듹.&s=+żu1+x_:ǀ1+xxWnx ƻ'H9NcI3 7&+7u2q2ܑU+b9CvObuË ԁm4{S^pjrs$O$I(M=RǂJH%H)/.DؚGז]I8ϐH="I2+CxX b9^EDnKeg+N"KevdN#*^be+i|!K9T!둌 cGdJLdS\}y3g:θp ޷u13x ʲ1kK{cu9_Y;-A vN"pWnIF}1 )[;V-Lx1 L.Xl˩ݛC}&J/كYXd~McA6V|" Yg7ι'.0m3edrn\ pKOT卵?29 h}iYq rYɊWQ'W"zU*E!}k)Yڥ0.҇FՕQ0Q22[H)W,GwˬJb=|ك. /)5oEX&㻉OA[)R:y(2t#ysDRF,H2qsuۨN7M0%Bgk&V+Ld#?Stg{.jEgZ4ԙ}}XX d'S=|HF{Y릧GqQǷL|;;慏qY$O ^-Y#> NJjZ]֍N$Id36c-n^x ZXuosv3xyfU3;&.=ҔGn[לxI3&s#˹M=2_S8SYY{YKt(Wy)O0|Eg"8^+@ku(;,@kJbY8FMgS֧>pC}))R?I/ Pcuz-njN[= F|)H}u(A^lv;lDZbN>Xp}ـkn-PKs57r|l\8X@}#3k.lv_RZ&Z}D#|AsvtJӔp\ !^#f53O RVNaD RTqt0o @4I?XcPS&ett煒bΠ,tgu ůmP  & ۣ}s5"(KiB~bVfOk .mtftb/=73 !^'b1T\!z⠵5G;G v)7Z60`4م2>j,NtJAK|"v`SK3J:!ͦdOa=Tguzr3OMmGHDQrUf H]%dRC c'ŁC-'^,5=,D3a@ހ(A^O>cޔi\+n @0벾xr Xqa&1#yG6iA&O/"'J2R4^,$rKPDWэ;a 8sxD')3:E_kn(9 Aۚ,`M5U)*6[YP3Vx%Ҥ7E%qƲ 'sY_?w e,^ɛP&s'ajd&cŽ5m|᧷\ykrf #GCw< 5oXT5dd< H?gBq?.=[ X]7ʓ\lDKǿ#rqk=!DiUF Dh&Owe͍#(ժ?{av %^"w'Hbu11-QjT%/2$Tz@DzT$}IOnW3LZM^1''] B`|&RfT_9z. M1i9`xWz?c Jè6LD^ËH rW{OhIx]g^O5~|k _a<[C0T|!቎} Xq>8Hv9Y-*exrgo }; X*fI6}V$Bh'mPE@(@ig BNCзda% K~dE4MKf!Eߕwnx A9Dʩ,w6e0ݭ~ҾKX"\򩕺VՕ^H&d6] {xhZ?es*-F@ܟ\?fIFoF_ߧ$Ebhv洷iBĀRXehJ*>E_ΖYMϿc S>A%\DTQB Tb"Dmn21%Q"D2|-gT@6@-ȡ:K]7uERU@!_[+A  1CO}N0L1*d7:poE%J߿Cr Kbx\xͺmK5juI|Cӈ( U24\@Md$"Hd >fGzڕj2ZX*rYxڳ'g{ĥ>Tv 5@5חws`,L_ҶD%'š!{ihag@lRE"ΏOqiG:ZC~,<*̀<(nWrUleb)FNc6ŭ5i ABӡs!A(B:l%\>t| T/*88T0~inj}ֱuQKJd~jY==6|zp8TO yg32⾌ѤLg?eKǁt v NȪ]'d3T?dHtO{WQ̭/oTtҲjR=e3J?UUg՛mN2 TS&{Df0 W\NfLC彟ڲ yNcMG^9D12DΝ|6QIC\:쯁X4Aeޤ67??u*~b[J rҩ6*N;}yE0ISθ>V) dg!SB֦`늛̕8.\AV{Ssۦ]X\%`oҏyyKVA;XBLz 'c>\"D [Q%?q[X(y,4m*Z ?Mӥ6;p 2$NRKjU1 Bl6Uoޑ$ӇŠvrshn+4*a"I׆7n5Y0 x9sG|1H]m2 ӻ{`ldy/yl I.(t6w@OVU2"v˾PTX)S9l0?ߍ`񻰚*FOv"XHPf,bvcJiJi}ʬ(Wvw$^0uݱ!\,w&~ RiKL:e,ţB;R>Zn 3䜆JgjBBR0aO2z咋цd!x-\, 2=Mz j$TtYB0A3Y1GIv[2VJk{V`w+R q(CлQ2D/Vo!)@Dd gAQpx$2Sb bM|=-1g<)`|`JtُM; ; r(̽.&9zz4LPk@joP}Z.ln)PnOuXPs*sw/8tJL^*&NB"Ѳđpi!,mJ]7B! 
M ylz]`U$cQVsHc}Aʽ(Bլv=hgLKԣ,VF @I 0_Z ¡S~uppᬌ<ӹz[vTR',&˨[:9lQ#z)֭:FxINs?K+rBǫvb%0PrFbV-qcW5TvxoQb1|Q?i9 *έ'Z$Vbż'F)ꎰi3`#4JXAIݻP| EsM =^c9!Nj^P!7M39-M^%Nzih֍ Koa%g+MҜCk1z#RҪ@;pYná=x naQؗ1ri̡o;77~h杧(F$9tdB"|1_w[A*24r P0=tQD^%8(dyg;oqyAexcC/,OЅW@Qa|Fne&z3Xfa٬Y2"5a~3.e)Dh<+}[V>Y>'IC?MH!Ԑpp58tڐ=URd ) H!R(*,@1^>CnG>J*=\n!=;4Xܠ9\& O!vwG2i(pf ^n$׮o3bpsȑrsS6FŤ<6>8F}dװG JG!#5\kp5ސRx(F麡o׷o'BAhIol6˅ ʼ.e9*D!~9빟c$^~_ku{1_'V(0%=r-~:/Ɇp%vSS/(DSWQ-l$"Z"hMyDO^y0*z34^3^mv*8 0e5ϗΗ^E5IV,FbބRlO2-B| yŴϪ8@]#DR^X^BHBee> >ZHeĠ5<;֬b M܁)r);%-([D0іCġƌQB;@Ɲ1~?o^ݍb" ?WHn]|2mSB80=W ;nSQo"uY0& BLʔ1 12 gܜ*KP‘^4$,9 0g"Y儿Ur F4I?3g;tei%FF^,Dˢ?9fc:3c)rC\f&"4ح,Г0z[Xn`dPIC>O'-@c_ QrB4`4Kll:Bc`:WUb2pSAu@lHw鰺7V c 4o /Iԡi @Ƒ*diLXD,{YnO^FBRmY(O)$^iIi"/v"6$)E8@Y+Jr#_uqU1 clLg,J@(Fs &POED>t<XZX+* NM0)KeRrʋ,1LV?ZRAbA>cV|RV)N|!' R@e .}LG9hcTSGYTX`Up4;8WA ނ0LDE:I4R \f$"䧸m8ųf,LjL0HUa29ݨjV)1ʰAYtfru6O5oG+,ջHs:Kף߷<ҫ+~י^lbT dɨ(沣 H@UbYSJ1#n~ɿCR`Ir[S? pTo᭾5+q6.Vc|z}">G؈4grs2_nv^Խbj^{?_#M~IWL9'^~f4)f߬zK M˅C^FCF#^rR%J DJ(r2V[! q<)TrEE ÈfS!7 'Y< 8Y ΐ[FUQj\. "3kM<"%S'*~d uq@XhUEA >D9CFJ/MR:O];OF5w1 ^Zd>[#F=1][d: do"*9ɼY%zy(±[(.C=}i'uiQ@ _)cI*F( N<>OEJ|EWqIDa!*3K6*kЙ1hhȻ3;åSY8yzO2t|IF#CVtZ/: I~onһyޜ̌4G>Nd,൦I%.>JI4d#2S,#PI3"'{!} Nٌ ,@ L 2\'ryD"9 cƉ/tU<$9⥪ h2E `.\]rgu2$`F!fHY)%_ (R׆lu|rޮug[E7t, К\13VaTkr !{J,_(+ a,ƔOd[@0FO:,1}QB*pQUuBPDŽKŜV{?No11.ڜ}ɓAorbڰ|~J5Hzq#3v. BZK= wB w:ޓΜu0Fa 83"}RJwYfU8۰8w{Z{/dp3ej{%Y~SS1zVFJgg_,d6ŷV%isjWޜH6g镆c!w/=xl* 5d`auRi,\2qY+ NY9ѝfDq֪͒||w_EXVPnQ}=u)Gkʶ|)g+!jgښ[Ǒ_QSvpe!9yHR}LRڲ%Y8,Kԅ(Dsjǣ@G7tBgRjx$ i%j'i࢈˲e$ZD;Pnc:){ʼn6e7x9%/yN,! 
"عoD C2.Uka !y6]}Fp+#ћ~<磢1GLUK`yk0p 楀N!3) 5w^HXqҦ},)Z TՠzQʂF\0/{&Sj#4́.y~: 7dH,oc%ʹƪNm<|J )#x&2%yE:-R C6m;mI6/؛KmQl6[ ۤuWDj(JG+c/G!$C,gR]𥍵2}sسR/#a>a`\جjkAao o R*zUTu&Ǜ%n,t 4)l<*ZK,M>'%K W?OQ [YE I>L]2%x?Q"#(& }9҇D0'1s2x|pe+SZo >A(+AX|%VX̙ϮTN[JOhu`kR&ekK÷.0dX$}7湄4|aU`3띖%(\ǜSs} Q 7J׬9X60RJpp;@r^ VxdK1cLALͱt*HZn icuGr9wwtD~U:hb~Jpc*vnsș/ŝ`I:x~Fa_ih8^On2]۟qOMo?==2ZZ#%|B&&M2hkUe8čR@` TΡPNhE1Z1[ J:Q`-4Y33Zk\©릃k'*3paRaw)K+Q:fCaM1K`&Hk~~,&%P:p,H1Af콥@ú*.4Z0c`aQD\]qN渭fQѤr>C!pNRM$~99;n1A>Xknu"4R2ht^q "6sON5&^ :{QtZ`5 _;ՎOr'!5ѦN`~bN+$dAsB3x0 +xK-ӨdC|_>=G:/$; K4u֯wpeAz^Q)%ը8px]g^ȁȜ3V(p<XYk_i*IDQXd#L=x XZQcToTFˣ)o mnKS6?Jz%\ El6}-3Es\R eDҦ8DQz:L) %Po1&J+$_ 0-LmNBHC T&K鵐*h nAX TMiwۿ+IBi}:gJIYۘ͡XJq}Uy#']DR'I3ID\n?iaxXhw.ZK*+qc@iXnD]7e)aِ p1&6Meι*Oy~67 ʡn2)LV!G`*$B@IÄ45H9InaiceќF Mo)Pu5WakƆ;`']Iu [Y"S:?;X+= RLwHj0Ds/+U9MUSm:;!zf8P+f=ׯ~bbkܣl|7~r^k~ ޢt t4^G131_0F r/_<|Q0=hMO_{y|3,+bF[j=Q{>ko%t4͟]ûs6m: "ok-t}ux90/c{{w=eo`tVa 7XYaQSlH$py~y8$\mY ErΖ_B+F겲kHMЕ E1[ l㪈 AHMz, 6 }&QbC$$u^6^3Si K.E:qlH?\ 2fOų-#8\Px4&c:g>ǩS:v ]GK:OxJ& (!=4;@"i&DӢ;%tSs\T? {yʿBWc0Qq6 H0ͧ<R{z%h3Uw5Q(t%!trwp5M,rȄu[dsA )=I"l d2^7aK0V"Cu kT3m $ 姢S-UoӅNXt=BZIZTp;%\(5bLl@J&,?W)$cB b"cp $gV0Pp2ÜY4w"bڮ3s:JhqACޕUFHBsvINJK yH+x-x 0X/Nq2&pB5h & yk?|+-N L3T1}=}Kd󋺈jXJa6H~>&^~\W%RVpؗ Ap8Bxz2%@nu傕ӏY OI\D],'ۤggaHD%==nc:6eVA|5OЈoBgA+EB)Qc+_uSK37~|?Q)_,q ab?>Fڛ1zocXwt2F^.aMfJ,4BRJaR~6(1d=ז2Q<7;zxϧc/w u`YFd4 KTnh :Ws_wFourYբ̽{߶F=w˵.(tM ǞQjȨ6}>Aee>÷+@%/X`g])ݢ?T5ɾ RG>=C ?=qlRo=' ?PuvS:"{oSjA-eİi3}vʪ=Ug5ƪp1D#\&ZKBG Irba4}c=w&7tOg,^P LC E8 wf_PCQ!6n`Q҅!a hbSJIZ(=/s %4zs3ogVTX2kNQ+F|pQg׳0ls2d!QYUHqUFdLYLQgļ[W y+"%8ƣ{A]H3HJq)}{kaaQf`dB+%Ō1p6Ҫ:27Scč5oX%sθJT([&`zQ -2VkކvhĥGW-r`L7(KI Ɂ͡[dcd%؋L03x0?/^6}lx/@}ȯ}-貿' "1%xbz8wDBilUQCpK$9bI1CXw ?x@[UjHR$db̥&WDFF|q F^~]gE2hܸF)3>I"\uOdUQ Euҧ1%шEdY]Tl`!1\&# Ĭgp3Pyq!>Ac%m<:}دds#S)u$H  D OA gRH&$Zö7% x{ ŸiNo-( ]~yu X6p`H)oH_`y2`=]N}H{ɿV2[ԟB#xl蒻۠,UŢiVpi/!X& D&HDgU Fs)2r\s|5Sznn兿WN &WH=%,xlRHFD!~kKxHNq^pN,܁U#歹o\8JuU_niK!͖O!Ĥ-;+7YْIoyyR0CB~+*0LTTX! 
%Hb)E j ˧ly&|lW\Nͷah'tjK#M< 8f8 bY@'ZQ2iN{@<.mRN1u9*oDUZFm?Fuf==@cuAz9!]_5qn\wS庁L5|>n+ ;A1_Vɲ@9CĥQe8eM*6`b=~Y1#a '&[E-,[G5Eʖ.1T4 Z o.qr~Vӷb\1mj |$mE e2OwdSb/^Gm $D|4MRh"c4K0e97_ez*l9)qMT\6SVRU)6d늌#tmB]7B7Eڜ3jz@'I jWX5P5℁eog)RH&$Zݒs/Pq̾#EaC}~4 oQDgHrAIN8±2g7&HU CU84L:@*x8C)$š>^αr+EQ6؛D(?Et{Da: fE#՚4] T֊Ak7fVl(}1-*po&ЎM נiV0(PN1{jS׫ujֹ[sux[4jAbrOm (˳oVv*(|7GgtE^:zzPeDIpjb e r@-7qHeBRRL?tjrs@ nɴ058)X&.ƿ k'wnBY/~Nz6Q6$χ)?Ɠ7 &Aa`̗e?.gϿ?4Wz0a+phKΙ4_L[:H/ɂ@?<0>ؖEzx:}p0? B}<]şp 2"e5g3@klOS]ɷ-;p#IDȞƣ̿EH Ad)a1qQC ͊P f9|W?ɖש: GVbdRE2`Ŭu9Yl*ͅO/}~-=.aX?~O0z5Ng+<)qT"TSDJz7O;G?/8`=xFz?f}oc~M}YEuf-mm߹։ 8ء4 ;2)=ᱎHFXzas]G hA}Uo5aC* 0KTxC@FYhAq1c‰)y͍nj=f|HͶU!L+vN:ъ6m-垀xl+d> 8bǭFA̭S\񗷼(].w ǐ]cN4rXǃ1L1$]詮Q[9{RnXU +sԉޯU)Q`L( bN(f AÖ}0PQd RbnG=OTHmU`ыƊ9ƴ/U;]o_ɨiyV"Vn$J_$Į59 t;ưW}0 +.5 Ml۬5,ؓʊX;q4h'J O+On}hA59oSn@ r F!ɮ 哧d\g Jk1ۚFPcSHabWkFiL!̶|Ӝ(o)pKh(%K )y6W3-eo)}3xMt}6 q^>i^<.8c6JB*lroZ,Sm&.bBg*fq .)=OqZP Rm;\h)i3d##PCc5%[w[{F7S:v sסXXհIl{TQQq.ȱfY ZQT1K(a(!IUEȞZ"%ǃJs4_u\lQ0ӑbOs xjk/=˰Vƥ _[ƘjMwÁiͤ8iK\(Ii?8N0hpZ ;J0Q&lQi>SZC-*͇61ƌnL.o8Xa̪ ]톃k1R˻\SȮZOo))#ݗQkn!kXm}۝m!B}CK:Fh[JSR% Lzen7= Q斀cwLe2a8ZM=၎6Yma% EK/ٳ*6t9%3_x:<?sYo9OM я|_`8Z|^A4?d0n<m28#HHin ↥(бA$ R<𛍈d T gcH^ع%x#`=uY-CRٷ/!]kSͥBJ=4n1v(7)2ea MBI< ʐ9'3޼y+NCJJJ #~[]5;jۏ5?JZ澗z,E5a(J;LU(MD1 c"JmeK)hM.9/eBc0_-9_YGҲЭ|u^snؖTV``_|siVcv#յm$YI.-~2`4'߽SέI.Sn DxadkvU;KFXLʼnnG#umH*Z$q d G#M֜\,9M8U)J\Tytu(g5"yZzwNQK a]E]*$征GwQ-L2O*mo'5f`0ywQ} Nxf FqJz!Ri"!FPpL}!@/O ^]-0hgg-i).k,UpcxK]ªW;FHQVB+sxcžJPVYe( {3a7?מּ؂?B谱6&E\P@҄Xk0g6.ш60V i^Wt B5Z.9ĂF,4Jx !ۦ#p*QR/QXILx[|ӂH}pMb>NfdL*(v)H>"YpwѥY]pIZwƪ#斵x_)3 Y/~N5g% bq6!,9Lgl4Yr_|t*s/9g7ҷLM(g`DE\T^{ҧfs,]={ZI>NcMMLP32X`BI)A ,=sBSBǡR!Gq*l&AGfFrF[&SQDٻFW~l  `4<*uQ%% wG)1yTbQWeUdƕ_<}Y42% y\: &s:R, &) pZ& d:C /ǯ";k,ZC7T A~~3vH1+@dǘ҂Ry+iAOa$y`鶍FrnG+IҨ %6YA0.Gd/kxlRs<|Z4mZ;.9;8+f%Ehuq .QڀDP6MN|]#8qYۜx(iVȲjZ//Sh g*Ɛ (%"CeV$EXb()'qrvN)jmD (lQB}sTK*"z~z6Wr܌q]Ik êH՚x4>)㥌3b@NF/v^O&kbTI"FDq *j)(!U9*w7! 
xsܟY}|K6xxޒ^`YiܜsmQ:JFx7V xkr搄8ZO8+ׇ$!sC̬YD񃙅5$!$&tɶ4=J:m9%, &i,lհs{W&u6pcQ}f:5z M8Ziu#\Цj BK?rf[0Vk Bŭ̎m >|`T2~L ^f2Jھe0j|4H@UH>)_ ŵ]/"xq×MNRo0,iAP&YP0QJ5?cɰEl:m."gB.wjsCp^HAj0v7|V:H_&@G.vj M؆(C?Dݷ5m$_$#_PT9iY`=﷜vqT EJaBíġe_w-.Cvˁ(ko?U|9gܕLrf(^yk*L[0[Io.e3kB-DtAqmgE5]kʴ8K$& \܅("%Ƴ0VcR&k:@;EzP1ց7):֔\+RpwE8UV{>m{1 J]: Z0|en`< W햡fT*}wU vצ>b|gd{ΙoR'}*ŐEE|ѧ h\H3Pd`,Wwx7q-}{#ߕ"oͥ,_@8wޞ[sv[|{%PeT~+|cƲ`_\ է(\d=3Εѽ|rܘt@#_IDR(Jj06 x_6D2|Z N]5jcCzb*2Xf0 _O͟Iui,P4ٛ4i}owu?>מf^{kOK 1>Ig&H8H͂UYP* +Y;h`.jyG~gt0dy~]ds76;Tf m<hxy|Z}\m#n?NJ$H\B6O*cze`95mG-PY'Ow㒂 p} :iw N?%,v|k W%#8"*0*% `͡"5;%Ҩuļ41lue8Pq8,  QfE%D ^ UUV|뜮@swFm;5N3i&8%HDS GJI I$}IQUaN\S A}7U2t .Ahq`3WaԠma)UEJ ЊoSB/1V.U5]ʶT`(20.B kQ|3G{grxmAq͔bT0L?D+S [O+֤qfȉtFݽ,u@".4;CJV1ŴR*TL 2W1 fsJ*A-C1PLSl= Й\eye v-g"E{%`ۭ[b G L\GUV4jk[_` wr>yb-T* "0bM>Ak]+ШGhvAtbB&?nSHi׉t=*ڞKkcM-sB@mP g$؎;Z͢f\[ˢo4v =:3U-7[+k4b_;,ZBTg3s"$I FZ xViC&^b0!b8Gnud&g˲\ GGs:l(AeǣT)muZfjVEI]^x e[6Pv OJ%npܭ߬dC* ȋϟ>uTeŶCA=Ő򿯩DXOJzлx6q-h1FwiSao_.9Cm&=QAt1m2%$ۻkw%%i@{;->Ȏހ갻@a >F7W+?}jwU$1&RD@SR.q}x'rs7)Vs94&c}_^5k]2"L޻+j4WAD|*:ť&,&y(f2C1 phk0jZ;uWn6@Tv79nO-FiIBD",c3ي9ޅzMcm\:*EtsQ\#T~|[~(N 5pS1b*MB:#{NP 9Lt~MمpfP9kjXQ_,~GQFSfr革I;HɒS8[(?ųelfAIV-n."x~w3 rL.n.E<ɿ|:<ʛ'\'\0یorhA[; ȸ7oU!hnh6 (b[Xhm?zȒ#G2XJ!;LUѭuyH8Xʪ̴DuNm>[ K_6isFWh㐿 /BENSoyi1#g"9i#c* "VFJ\SDg][o\7+vM"ˀI6X`}Eĉ"˺8N߷NZvKl)u%է)2f}А8`pGf§E"vi,`b3OöLU$h6%Evʔ]XRCա V[c+w>h;F~9oXiDZX@siη0Uȭt a0)w$m=;jB-Uۘ+R횭m@@,AS&/+w>h;F~ hَhB$]ӣL~<͓}BrVdEru\$l 0:H9ӬEGf5٥,tC1"Ch1U!c]Ώ>h,:2` wDD`|AEwE0s D1Q_W<\ كsn'5y|iw;hAcv/$ԤMn頚%;K MZna/)s)l>96n֞Lڜl^S7-*am;>hG~c.h;F~YȴG?=7?s2h1:bSnF F1\6mo~RG0XQC>/HR3H$2q2d8 q14B_2<<9ЎM؆3v}PJ;v9/].3a;ȷh?:Q~eg,Ea[n[{Kۿt]ZFeɲ(}鷣;߃A)=謒KEuksD㫿ZuYON9Zie_R̮W(tO-mċyC X, dץ3ESMD4T !F$?޻(bvhhF[rf\{MmBIw 6:~ZK!0:,*9UekLe^X< V׺glkP(`2ʧ?K3!T_, Mi \5s,R5ytX,*{d&v^aU;X)vbOݮX"9Z+ o]43~~5׾}.ȶ"ٛob`|`KגJh>`68+/6y-k=j/b-~~ʬvaw]w7CDf5yx/E̲Cȷ`fwb9ؼ:h\mOpSuV$c[Iƶfl"GsG#DcH{L]iˬW{7 OD3]h BI!d^moD]eDemcj_Ӳc)Aܣti봤\xFRA_:& hlX[eM7=v;#P3C((=ټZZ鰱G ߖ{i?>2TԽFjT6*CS?M',ŝx3 9] sBlc&%k\sZ'$.+2$Irhn0zAjlR]< ,@V bsU7 
o1~5ˮho#ɽ۶zRkNOm׶C`Kq{suٺu+ڝQ[ Ϳe], 6ԘgDrAx`_f@=~U[Nć;O \6Yc7k('^`Cf8޹l{b'T/ 4ԗ.d;> jkԗRLOS!oQ6{J Z_u7R~#1 CUːQ?:o-w7'mXښNwZDDRySִG06.h0oJg B߱ |cγe1ެj ԓ᫅fX=c:-p=*g eT'Ac ޯڍBBB1p:nZ_S[nu['AcJ[ȇڍ2 }vpc8iҭ-cJR);ݻ#w={~S\p}ZM2H+׋sJArn236';`tyeW04dt{NCStfrzP:v!!Bd ~:Ws>\G]8_QsW&7?T$ !-Xi!6:hiMCXwW k?8 *5ZTXp@IQ/k6~-\q+u/Uf0%GSt'VJS#~1(w#f8ot,+f[4b==je N7tSObh77'5kQCr%֑C' ,NݧV^~YCND4ٌ=>f `xG-wZg+_轆 'ƴ˥c$‚" s-x w7(X 66i);(NwA"Es:t:UPEпtVq \ܾT,6_eʞPeEh 6B&IL[߅5ɢ=u0`Nအ./4pP.e8:X8l FӃ h{zng@k0@ѸynYSjXADϰϩnF 熁FάgM>81! Q{+I`\\JV> płQsGl!Qy< fZr Mi%a*`bn s%CDFZT*ZB*U.,Dn% '.U +y_o0( %@hWP#εUqO5Q9Q8(&'+Q&+F1R.D7CM$CY\L'XP$K$JCp U2r<&x_>LDu H}S^]ϥPYYwgP0m o* #.O?N2# W}|l?1y\)y\1?wfjZ~gFG xL6K u|{p=fp~3M㱇2$D5zلcӣR)빟&PkG2k/A^vJP= ;.~O$DF̦ѓZ,-揣"SΝ}yOAm[01oW $C o 4ֱ0|1iFH>mw!3b/:<9ŠR/fMk!g5;{Ki6o`^Nh{' .Wt ydDeg[{4!bh3x7Yz\ ?EO.i:qD٭NF{| n2*4I͞M e娩! ,')=`T^ EC^(YwYt6zYޫg?n 0݄p4 qi+آQ0$4N: hO `:5hXQBr1_eH:o; nF\7ba)cH@RȅcJÎ +P d850|2 *MsN<+?D3./h&دbIy$=;G%8^Ebm'ּ.ZIʨjnӝ5o^ ;+MH~}X;"MB*{̲_5Uvw"frWFb,p(YQ9E:Ք>ҡT" sA1&F >UoF(h@7U7rҬ\uyr҃&Y nŌwӻ3-/pYxpTsW&YM7 0T1#V6lLHHǃtneAW0edfgTTZDe$h<]H &SmҸZLVzIh rP*iwmzY $~ Xl3d_¶ȖKr%ݭIĖ(u"Y&Д<$LS+H D3EJQ!vz֦NemHo?a t+5ruH]΋,3csB]̖,Ҿ g51LgHUc,z'dx#/}cxCaߗBrӡө/ifhϤ(TTZQjlc1䶱HZR rOd6L-ShOJơxT y~h_\Etn mpvEu|QfvkyҜ2\Q٭ :PY:K{A/r$caMV 0l UO%tX): sy(k7.$  8ϰR)JRaH2#ƲH3")a p2޶K1;1(<,I 1Hd b4a$iԵ`crGp$`8*y2DE/_{SٱX5b˕}y;Os& ۮٜ /(q\dD#pHQc5G(fzDOZ 1 J[)taA3(S0~JJA N=bڝǮ@wW,AFCJ ;psL*mȌ3-dH$NpJ:I+"䈒,ɞS{ݬudjS$FYdX+*љIclp[U^,NT.FV,ǏouKr ߿G[ WT֗b*0~0/V|ZN&zp?g&|6\?3]yȖSXRoAMrLP;ADxK)3G[Ñȴ R`\RSki-lDtgwF2 _goKJamhUSw@%exs;ro[B#+b=_}G ˞MA KXm[ ymBQ5ipd.{F!=QkA$R4@b#Πd#x d<8+)Qʚt]7qWv5C^o[gG` Vjo~Q^- 73K+zh1O^9 Dew7d6 qD` 2pISSTe0Cfs4Rbc޽9̏' |:'eHtv1ǩ/WhFZ([ .sւ/[ #2a+tBc{o4$1%f㈉D"3 R2.y b* rlo,?NUsW#x(@S}ܼ$]!Cu&[üZ*]9v@Ux4pUܷԸvj|,¾ר .Ũ*ǵ2CHĿSRD:(ֺ_̇6߸$Ļw>؜RǮС :sҳ\Bt-0sVXZqLg,J4X; qA>:O*|E@, eJ{YLaQePg&U/q{XF$M&Db#[(i7&$ˮrP(j#48.Q)q+VP+TYlx,\ScU-m;{Ztނzޚ`mD@Mdj%9ԟxܣo,p6Mm%2[X6m_ }~@~BU 6i!GO6^pNO;==(ũ5$OщMϞi׭!pS̭{_5s s {{r411oIK[Na1*IJXQ"(l%sa 
lM2ΐ="s2_GJ{w>(=Ǘgm>-TJKMEBOK}x+ikSGM ?@)2NjA21%Fxf]?bOgl&0%w٩Yw\;萖I~͸\j=n]2߽'kvmڒ&j9DE>Q'8xJL;Y{:_JcB=g}p}eFm Rqۤ!,_Tˢ1^i5+*8S6H}/i RTTVҹEqн Q'G# [I3,0[f4jӭk' N f]겕Ê7a*k~j@(,D6₂#QBBl/A!"m5R% $rUDŽ]cj}>&)ǫbPZDv)8:JֵzXStQQ*8ôԙ aCugce]=.s? Ng+oǩ#pQZtPH 8 y PtQf4o~%8NP%uB1C@uCLm j0 o}a@0 {.P:?C90ڜUmA5h~VZԴM8eKg]BB ;2`hK?7"$X/9-. @Y+וX{l[]ev?̴9ge:3[FO?}yNIelPLU_}F!{T[0_fԜ7QLdԌbݔburZݧ?,]@[H{dfGWUK0o?qO/.2[vrP?IVUbbA+v;EO:ĨC|'rDEB$z ֓PZ+[9<\N> qQ Pg\$-둛>}}&Ǐ/zm`'bEm5EV)FU.FFHEK>}cCGtP kNqD%+[olϵK.,)z-il3oiKTemοW~"o%Sr!?;/C>6'8酀A3#1%z^j6ξ_x6}QUփړK$, H@&*,KH3EH^n/uJ=i5>Έs3` y_qǡIˉoUIu'^Kxҋ*%F˹Km'I$!j۶ {yU*ҙ`vdmpƜc.=Cΐ_{tvb3H@%w|~1%eL'Vdz_b!B+;pyC"MBag;tw'%"'XJTXc3(ԝ8s9:gaX,\e;C;<;0;;Q: qĞbCn>R@[6~gpѧ0ȦBwvrq`;ڑ x=#jx!b58)001NS,CDe1@I$SSHID$qQ"2,SʮRqPʄZ+F.X9YKN{Ht CWkŁx̢,NʨH.dI@Imb!@02fbO5J'۩sy:2)3^o,24yz<16@U^,BPv9?~|-b\~x{o5_q7S[_Q]Dϯ5;}r=/V|ZN&Zo]8g3ٻq,Wz; A2=(PU]ݝ_Rv;ePw<$e|*0}yyz흀{8$H;7O~Vٍ%0.f^O_*ąL8R83Pv#i(no?yZsˉ*(;̓k+XqEo+V'{^,jۇ-&rH.\lym('{AeVrR$"Ige{w/sٽOy`=Bv3'@rh]2RmuDh浰_t\hi*$+c<20y*V,gY|L$=kT[OZFQ\ޤv/:! 4iG' c3_ -ś4khSI% ]caCJmNH Aw1ů &ejRT %2[MR2q; 9J+}jR>f󺊍=Dс8+9_2Jf"nƐHyIz;s+JZyMpf7Vg˵M$ʁ^nSG3I./,{?I_n|޻y_o0,Qy*J+*!H3,Q9Z0ĩʐBZTF1AS_Z|-Yxn,/3o%Fhݟ! 
?/,h@+3c< Eqx&<1\;Z-8o MNQP!9/lD*5 (\eQ\fQL2!1|]$9v.vBGD6F[+N%# ]+b"Lyjo~{[%D/l9ޅWlT㺂Wc+1)l_QBWnu)_[  :䏰4sbLGފ&O?-޶$Ip:xS>F%jD!"Z3X`)cH%n3r +H@$Q2E F'8 3kuBF RHrRI W[FiҬ0E9ns(1jGjB&3_gϻ +\6;=7}=X9[0-.@Vac+t$l3/;_NL7lGӪ!(K=W`5f=ӷX`X7ge"7(GRL`r.69!R;2][n5̞Bq};uѯ6\C8rwuJrG~~=6td_39p썖خ-Fce14sCwNH35Q+ ᘭ&AECo 93H @PDnŸ1(RR^pPS~ h%h,5HkliU^Xɝ QhlI#28Kg*83Z-]/q#kg¶۬#.U4AaO` \X/6 Gnz/~~{6Ao;5[!DAA^B_Jz~DEދDx]Ƞw@]tgBwD*;Ky`{ l;Bn{ ζ~˵Wcζ{p϶.nRװu $8E)Tpfsz nZu369YK8CwwGwBb)t׼#JTQ=NG9=ܺuJq * 3)uf+&o/rSw4uh zNLYP)гUjP?kn8|}{SKMB\GvQ'J.8$V=+b8o@D [Q"6sչŨ +hL«.پY.|ilR@ 9)2D`a?ڈTqVعװPO X$?_.$CG(1pOZ{kt ;lC "ߦTOaVx2//֞B+& ݵU\=@S35La e&a։]ftR0"(sI*u+ǘzRCY4E_Z+bah&*8n|8T9q8!נx6+"S_<}ҿȑkwQ_͢zE,6Px v,g'&s 02!K(zMDtS>V԰#Gq(nF/Vg7BeP1DT~j3R-2%EQ^0WйH*TR"hjY*2kyK]ˢ`8/)" SWCF">NOdNQh?7,^`Qf@Ox90-?_&@HL ZLQ lF d=k}}J$fQ4QY'ЗHFc"Z,'yX< Z ΋){ZymPHgQ Q,A }P4 ڪ"4h#+A1;Pm*$ER 0S!F\FNW2!RU;fr1q4xV##ggRC*4U3児y*DMf2gFE)R'|@@(iwhv뜛^0 Mѫm/+ɉ@iU R,3FtF`)VpuP$"7e": a1`Ze9sC:hیk~gځ{BY{)/ѻ2::Wcʔ+1LYQeuZ,50#8\:cqaD`mPA=)Ȁ:Ky"ILH6)2Ja֙:YY!U@?1 r.sz-J^]'~9 J7R[֣ee9;RMf7v[ݏ>y̭P϶ ""A4p9oalX5t'/LT/]|d;Ɲےq ӣǻ0Q\,%D= tKr^>p8v);m35VhjBRǿ}G,bl'~YRe4OvV8??w3tpTR~MڟL"QbY0j7)J ,V@v#rwsN'շ[o] d0Xnjo@. Zd!+"43gK~B)\JAFFVTgKԖt_.UyXl pTB{mӣ>{j[4/`-ۿo/GJ_MoY1vlfÄKאc!f FN Aܙ;VCro"z>cOP!s . zP%FC/^dj^~ޕ02p'xr࿾j|rOSW_U֣P؊>e\!!њJʣ?}Pҁ I=HD%9ޣsȑCqj7݈lCe=U7n^>R*b);Ľ6ꆷ׷7T,mE?^sv3s%֣344):H8)k[Na1:б`ݿӳCcҜxSs̯ۇzDj/BrcwΪu:*9qb:a4hTe`mb}f(81NGӆF>*4p0hZ jvnbB1n<ȵ{i,*$L4I[QzSoӨk:NSTMٲ5OC3mKL8V(-)AJT7&u]H(= AڼZ2 Uh%2zf3[`O{35cUm;A>;;'# AQ?%T; MNK=u-;wʼ% GATעmg 'cJX?5 e|{z J7A* +2V\ ʍH#U"#!8@K-5늯z43a<0CRSfv=oЗj 7ו3Vc<ò^LsUSa)ta%y~qV?K{Gԙn}iono./ַAB JZtibt̊)sRl{ u-@d;L}nrGF>@5;gS_+Pִ&.jԶ%{}^>X:rhˣvh)dBGuٿҋ#Bjnq\~a+""6!8 ta"ӥΕ+eJ+QΩ,ArGzx£+"+69a֠|#xwgd?5Qqݩi*j;3qHlXK͒@jh`;>L\xxY諸ƫSs"[Kq89 %".ϗ q62ł0 6cC?.j}4XNSmРxN\d55`8ƈ@'Q'_D*_ WH1á}i'.b`@VPD}8VեBb 5*C)*$FMwoqx_jbLƴ/tgHtD`:"u ו-#KeJ$4tEH! 
QAuq}:QLRHT :V_tXƄa@( )2ÄJ24EQL*g\IOr*RNL9 6QNVh9^PyQz%)Ej6Vp$Y %+knH/E76/3)C\to{bf3J  ks12%t\),p4\G-0Uge捰x=&6̌۠&υE0&HHBtWSimM0ҰY[ x (Qs R3!S֪ZwYlӦO frϳz ,AdH/V.m`,ޜ}¬߃^BeVKeVC^o{"Ʀ@3doӃ^Bΰݱ>B,]xU޵ +s)@Vwخꎕf.wcof]$aHZ2eFk6?iލ:;`&US^ 25u딞 (:'5-ݫfJ)I>J?2:}FP}z'&ikju/}N/INMGo:x*3jYAHeAUV\@s+Q㎼#)|U˗xZTyY(UHTxUp&3=fMHRO!$*8!֙U0nHy: MAV!ᨡY=x.\LaG  4mhrh74!dM7zٻeq*L9ޫ>@[kفC0ɄMC⣋^9ߏtOVѪiއU#GS66 -iUAxԄqz! Λ?|(q+9\{^,󗥃y]܇cq쭧\?fwSdؘKee^1w}&?36&OѮ'v/70L8˗|O/0v rQ}]pBuAGN%W+Zw5J1s-2Ҙu$_cOGtEg7ڵɣz/,sBQckh?'_MQtN\T'$}-򓻼.e`)`1YHxY J2|J#9\ޗf ktabs4 (&h\/w=S^ͪi͖M]Yq^j/dqzʼn:NHBx]NT`-B]-^EdVXĄv){!MNL4YUe˧x/im, 1v9;!K_j0V)a/ {/1gpHrMg20(q۪gYޯDZѱ_ߊGb3mJQOksY%[yNV TefŖ8U_K{*\?YӢ [QV4#ZK pdU.b|A8: Q==Nqm# L#Y&?6CR`gXfiaYъa{2#z j+2TeDba}_Zdú 7eCkfi筴Mp"P&mzE345|͘suv'YXO0XS]0Yqy&ٗlz,D JI&~C޸eCcyeH=?@.Q%g `8ԧ&ec\q罌|?&F!4V. HH&"Q()Eu"thp})T%ir;Tf#m^+c0^m<."c8eE#=$#!FC]G 54 Bp hȃH,i/S b|4)-d'KcgL2֣` աϥ|h*B qA~*Lyl/˻wuwL*N6|n:ûIϤ󟙹q$b|퍀gdK9$?I9zS'`D諛s_ fYJ&u\pzBҹl ;ヵI8 h{4>_)ɹ4y7 #cu6SYL)_MMm S;*c{=[ 0zr5JBr5հR\F`P;^n2JƓ /GFW(=jGRtCsTڊ^oۨOozb)g=*SZNy1`Tj}%ęEFϷnbMkSQK^53&S6+  M+A}tr'W\ÎۜOJ}JΈm(iUh$jV]BEj7*NCNE((B;ش ?KugVBlY\9xH˜M0gJ-I1ӼcF<2*@rnd:(ō/DQwb ʡ~}R010zrI}?0|&k0oAFa$03x}&=P2SS䨌7TnS_gF R}nu1(2t>u3T ٶnSfnM!ZcL&營>ަׅM@?|+.)/É:'__9RUe&03[+Aygn]:\ZK%];?Tލ%'vrcWêF9h|bշ<3|H{ܓKM[RR4 wAl# rmR/fJOOs*RC-Qzaӣu/7q7u)uwyg?;ƲLϤ#Z g5\MofuZ#'pׂ"Ҿa&u zz򒐖,ENҠyH4-)$jW U[#K5RIq*U譑YdՋv.Z*+oAZF# w6XoG[BU-ֈeY>Fq+Y6h4,s@Ĥײm/H٢{o4|P+H["5"ѶծJ} +SEVJPV[#NN2WEꖾ%0*mX'{76H)fC6<K Kt MqđѰKZzoغS7{ʪ6eHcպ6lUKXݠp38j :j\,$d{aGYf!3 F1B90,D3$3éNUF8W4V mLAMvkXQ6va"#]/ Y!FI0#=4i"F! 
P^]/ ̥N2)(19>H 8G!LvYdDhZ>?"6 GirIYrB7W*h=,9VRI+%ہgC/qF!ve "IX@"/|`c_)AO2˖2W$r@*RIq}jWa%qFC tB`zu2fB0SZϘ@$c0sYEƔ][?^LFy8xݼ 1u7|Y{|ܜx xUy%Q2~njRy6iLֹ*-rPOcyN &rURН%&[0ug_[1ky&:6*: "rE=YOiwxo،իFM~uA.za M<.(vp87oYfmPs+3$+s5Rҁ :DDhz\h!BPr*#SXRޅ嬍翪) $`wsT|^|BaJ78+CJ }JE8D,`dnu!'C|L^>Zscr m$aGYMmZu3QZ\̗?n.cEa8 WaSey7&~N>w藟Ǡ 8pi=܍]k'k>fV~fz>FsA7gݧG/wa*Db`~c/xK9&e{>O95x)QQ6c?1K8}ODQ<Z+¥F00&P]%wt >߉~J7Sk8,g;q{f//t:N$`G9ꡕ8\:)@G^ l¨jǕB- å\I5QH^m xz0 &  aMK` ?AE(p@qx!7@K]PVoiN}ʀYE 0U9*/|IBY% i , W`?h޸dʁ箠+Ep8roӕ\&&GKF'hxSIʏߌ\Dy/u7T\%s|w⯒Z** e3Q[h1 i ;iYl\!N 1;g~lՉxМhGCkFtQcZ{/F{:#M|9&գ\NTv60M%竊9a] d˄tI0bۥxD|r.;hcNn_M"҄vhks<^Ӡ+hM%t᣿'*wTrncW`Yx 3ķͣo|h| tO"_p=ph+onF_?H>TOJGm[902puoc\I%@ ѠQD`^xTK_rߗsD8xށrx;8͈G3ًIZ.KVޅb!I+tJl`q@}MIzQk؏}"BAK"}c1HEa)?Rަt"tz9 9EK$!/Gr~+%%maފc/s&y,̵UWm3.KvQ`&;(_e$gA[ OG \@& L"M%hQ%ukCq-<|q՝wz6pAPGRO@[uTFH c#fBqf|bKTc6B2UW[`uc,"$/qkt+ *\wdE#Uuv{O~Qkw'-KىwR,/BESR _AvZ肖U^֤\5(}F^k3}'Dɂ5'udK<@2NT {wQߠ7<]i8}oۜ7uj O2W Xar״d<\*E#JR9?1)6H.k,ujSrߞpy=XQѕLN{ҥ5DEpRNN<;럗5HF41'|Yz^]ejsfҺr דQPfFDuU:sGT,DMG@3,w4[ _!iwq+M埱R kRTjj[/)[|F([85d$*sܡ2ţPQ%˧T`ZhI )Uߕ$RDIAtREPA"41\gS) 鹒nr*٠TaQYJ2 |mAGE 4ġTPKqJ ՞ŒJ*?⌒)aL6)KK.s]>΋NRk&A,Y.AX: lI lTm''! kSƶ:}u`&XF0%"dp|)ϖ,3]?h~fu?cgh,wƭk7\纎y k`vʈD@HlI0R? # 3eEN@Am?qbةc@{wW>tCB[>beVG>"j[ܮtc}X|U<7Dmd4nXƝNDܩw=&srnxQy 0+W'`iÀxx???O;Ã+XgO~䗻onO#ɏ&:2CşoLr|uN̛O =p A^݇6([JFy$|^ [y n>Ϟ%aԃ1s.OcJfcJuzb)(ۛ.^rv}+6˿]tys~7׷oK|}|tO^]\~m<'y}vcޝ9~}vwry}D}>,^9ZCsN]ѧ#^[AhfϝO_N?[Pwy_jyv;Vߛ?$U6\F˷kro6 +kteQ_.a(&~jt]n?ݚC IͧWǿJI6Vk\a|x-5w2x/-=r <;_ߥ?}9 =nRW'tM~vMě[դw1Fc&!gk8v·:^7Ž'9/?zQ&`:? dx}="Se{l`46-o:8,SM9>ΰ;8,AZAFzHY7TiX w+k!jx( LF.glc[̜ظȾNxO8hs} ijFs}3Fmf_o bOqQ6 7py翵ԥ:>O^2~5l;Vɽ>eS7 -`H[ʊx4؀f&UZ0x|@R"9UҐ!jT3E+ d(}2{ @g HD( >Rb%ny`% AQB{гV6ҷ"U4~=ndG,FG,YG))󿟑ߛ(#rq24n;mI-@+k7و:x Ui_8̖ep7xV=ltN3T4Ba 8aOe=/,mʬQ@_ҹәxu .o 7>m ⭝Z{<̭h*zX90u'<"{bwZ(@ z/`e lydgR)!:aF !(@ [%C,UQ| h@ =Ba *XBO 0PtŒlR]g&l-&c@bd7=SImeV s"M1 ΪlXM1 #ewz5Df3xv2̗L1bkd-L߁و5f\t2:j揭d\C(ΖG'?|.G Tp㬷 + ljv_vԼv?bN6Iw?Md<]0qa1 H`4H E1+ -!oR\1^@i[xu%Tڲ+iS5ln1)u%!h6Z-),ٙ6]۩VBqx>DNۈm! 
Fn"ȧ#D ڿ㻣*%* /_?=8+ emXCk w*j$wL׸u4`ΖOO ;W˫#$r(rFэU-pj`k,d"[6=B4硢RnE\G[@5ZOOr16nu<"Bjp=xyUD1XU?XcA碟*\`LFWy){# ^R]xJ3g#_4DQ7H xvxm{B9I.]a'#ERʴa fƁ_2R֌Ut7&^{/W*[9V :%åD}ٟ)C%I6!HUlrĤ(1 콐Rp.GE ^{0NpLl {LrbBIie͕{׭@(J%r%Jr%⌤zzgdC Xp})ouiV ~.BNh3r$(dǖL4k4GbQ<\PjܯlGͪ#4z1gg:}/7ešn66O*.ڠ%ʾk(VOZQDAkL_~las~%núO@5/,J ZVQ{<^LQ7:`x-s_nTYrϒ \x, ֔)Rf,~$=ijhL(X y.2jLbQb 3&c܃@zpvu;>9j&|k }7N[ 䔜7n»U⦻ٳ|{{}>Ij#5Q]P֊),٣<J_ز ֤K3"PX1'%]aw(P?-+<Z .5p!8<[(5R|.9OT>jϙԂyr׿߮sׇeP94~bWmc_c#]r!j}x{ }x] C!pP}L" ȰLGLiiHT%)* *O~_nWkq{+v ~#,i%+,ZRڋc8ab v)đ@CB' h>ңgt.0&󼞚p1ۇ7yBH2CN2 zd^ԓv"x1gQM07dRDqC1N 74{Sj4E?:Gc3% AqmSLӔ(+[^=.^mZ^1C}T_]'^(L>/*U}$jY퟽\~|KLTW֍KlN8d0$湁mſ7D!5%9Z\<^rR@WkorV6oj1dnstΞN~!!׫7}_dWԘ*[d[e =Gc zZ;ԞoJ1SKzgW󩬭c4︳ΔAɐL'@\tD .S:ȨrL9 \(ko6oj {qQSƾdVpl !6w׬n~1{N WO?o/g3)|ïWwwsyI y]ˎsgmbr"eCMIA$RY%TcKA'\5L3 EJEMN>U,s~ni t~ATȮN5B Qxgz[oPBalk(ТkG4e^E(ѿ@5LWclö́Mos\/\RbkT8wq>r?]N߻{.]C )\Q\g; ⥭wߢh. T0̸<)CR8Q(%g,MȂ̶UٶY 4b5'\KN)wXɃvVg%@Vrg.pOQ{ǖtfV iµLe\ ee)IdI{@=|hlOyÛ;P"H~i# =І6QKuz_02ryxf+a"#P`# w3ԭ#@0jFuA*-y!clzSJq xIU 89 Oz{.Au?vgBO\>{9,3hhhhE-tKLT(k,hH)XJXDZAQ>sՙ{Ĝ,5v{g!Lj,#1¸Pdr0T誑p ;Q7ӽ#.Snnu=x1u\%OnϾ_(P}ԽW\׵8@DG?*aL".D&sswϓ kPU˯_|̩f??څpˋrPl¢KU\MK'K>wˊ(^̖fňRTVT %Jykt޻w)Id]:S4Â,Zl;OsG[?u\NbvrAtoJo]Yw?DzkswL̤s<JT Iu@-15D]Dq&Q6:6d@)3{Tm~K;sj}oxypآ.<Ez<SBaDqnsVH-{{>T@[=A8P[ h Ĩ6FCaI_ S4uI{vYBέʶ_9u/]ΫaB.+W8Ws"!%s(r!yHAhowml["Oeywf_všt}[}BB};{J!$|]~:2,SORS Xa^Sz:'B].l>((>9Sv Fu q &j 8oa(ƾU4PjFg׬4,4TBwS.-"f3QbiԖ7_ A_/O"Q+` ܟ4*b, 3k۟_؟Cdj?RȏXx f3Yد>T_-{;r)y/PIa5a< `BӉιn'woBҁ! 
@j4223TZ9T2R$(I{L2+ hRrס#* ^3=YQ@89PTdtp<:tJ<tj:>uR Q WYk 7/$0jOxy`j>WN/3R"lA#dhDtﺂu-ł̾䑵|w2LԘFR=cnES:.sQt1AlY[tE ;ӱ% zԍQ:9gpJT=Y@np-09!wsqA(UU+g'\F\–0WT;UAU@SˆM $·1d W&ɌJ03"¹cNKj_e̙oVɯET({o&fCgw4EM,/̨RaKRh& rA=( LL8Ai!"4XĒ&zZ0zwG2M~ O#[3y#'[Z;8)ȯ1<0!k·.|PeӖ; Ty_`(T:mMde95g#33 ^aeeUˤpepy 菻+TeHRVlXXt0ު|J5l["KY}#QkOōc)՞vfعP1jOfq 2)s0>a%LԺ7Jyt:U Y\2{sD{:tl{^1 :!?>؇Փ^ O4BJL$E#%iKc3KyH *E;YlkmF/eOۻ|^I${~ŖlR{V[q^-67b >ۤXP,J61锓 ]lq`L<¹ YHbL\ڤ`ɌvQ8=Q=5=_Tg`"uP1w|">]m2.t6_Rs2]cI"Cu'sZu4:wKQS4ѱ'JV/.h񈂪tk#*ȣg7gWzoנg4u+Oq(4zBRK OćEbY])&|3ͼyzDZb4#%N<[8ŪE]|E!q)}By+[Wņ}hye- z?|2geǟ\8#Z7%'_F~jtAbvB:&5u1;yBhru|ÿ%z~zr?W'hkϒ;|t2w>'Nʌ/ROl|=#o~enetȖθ<RpC{ztKi wz$FK8D DGu!o떒EuiF<{z>HFJR;?beGpIX 3pMF2k\NLpJNp&H%XR[.*^@!s]J228F8S([|b}|k*e!h 6 i810A`A[UZQ+!(SHZs=Jw|dFJ˾)}P3כG=+tM\C\v#o i7&s=~b9{ jb{s_BKQ\R2s)JckU&vhGnY y2eYYc٘+Q(OAr|zD24td ᛪOE"djO =’1oPƼ1g8l̛I8!gc ^h)$cJ ֥R41+>(RY0mOhZkJXnff3?k-˩cQp/k+)1.9ϕѷJyD )E9, ,ޗ? QKƳsZ폍/E 3{oFU|(CQ*#!gG!dB' lUJ^[ ti% q􏲞Z?NEGyV_?OmIqxM:R6b?wgi奟'$~֜._\B""y Ra2{$$'Gtָem(>۽WsƱ-e )XQ96z,Nʰ`F!9kt}k< D2s uhXzbD8c>@8[U!TJ3eTvWS]Zt}5EE5EaƊr#FaA=MjvQX\HD&8Pܿ>s()USwv tAgLmJ1f*KkJGL7{.GNsg) pP|lKpR0vqR) .FbHDokP5BI__6]nD;4eaR+[(MCVNXK(%Z@bKrhm%A O.Ylyo砂 =[j]::pªZ3齫7SwMMzW  ~EսW"T)tצ~to^^'qXw(l3ĒOO&Nl(}r#POg<o>OH#c@҆8lc|L:H\G"eDchdLj^1ٟ BtF\MDm¬5}~9AgӺƉ] TjRd%seW"++0s"$ǃ9=K`sBBj@o ^f#Ӯl'(EdV`AY xo`ALRy2;yі)ieXW.<`+5-w.kF^.roL{%%Y\X kO[3ѼbΉfKxӉrPˤ7'Nτ+xo JR^R*Gd $t&#b$ #8,6`nwފdނʘ(z$;&;f?,0c4mlK۸P vL (\RlIcb)fɖeAI Ϋ U;gSt]T}Vj+D Rvn@IkT̀bMwI44×Z2 ],pdݮA לigK%{QP]\Km'Hdm"b:Z d=߁"# PIl c#uIS8jLkߑM,R|wnנŐb#|كYq\?aIʀvzOVTqG(?"zX.>v]*t{̣iH!<[rw/S;Pzt‚`d]"^x=/xh'wtrhM5=NQYINzi-u)+1->dZJiKg=t2=qkq޽PoW{L<'@n}矰B)| 6ɷz?`o)݋oƍRXhrQٝ:[ejmqrqjL l-?,uWToΓ+7)Ƚ^_H|p'ӻ<޲ %kuS2GO2<=:M'x"5#/߽eFY Z\V/薠k$Qi)3v TAJN::D]hD̬<$>kS9k3h[6<_\l([PX|s%mCgQ}'O 9*J?'u78u!5r˰^&e7Y5r1I>[.dRpu8=z餹<9>.b-4)ˑ!֑L "̣ qhWCS\Wm% ءZޖ[ s!MR@hX|69l5Z|y{3mh\C̫/]7LKmvM\/kk/Qk%VC"U wVo #0VF9DVXy¬6t8TX\oMm`iw-Ni{臁O~e?s۪;z: a@US$qGH>NfByg* g>݂m s݂^|mP|z}-b;5xIĢ癧ǘB,g&w''U>&Qc_\8"J+,!=p)|pd2w.8t&ћ,3jobv e0 5T 
>P>%YڹQ҅&b<0A t;"D6b]ǟHE*uP/a"Z-|𴷟\=P&cb<<ɦ#^1W2Y:XKe-bu"g'S>TH^Yb1b!%[~֊dY~>v[AGm5ђwGmxTM¢o`z OG}9,<.3w|t~/D~0޳67rWXRhUr&ۗ/v`VRH{W50̐q%!T..x1WUc |ߖ:*FA]Qc=a5H9G?gyJk*qF7iu(0z 偒!NiQG@?eUHȧE4Zَך=kF?F_{18_8mSފX X򵣠cWB^e.C-X-r2DKpCbV # `8\zvR} ;Jl>e"k[sq{Pв&6YI7F_̣򬔽JH]SVgyP 7O?_L7>>>Oq6o""*Zג:[pj&t0Nt'sU{QYR /.3xrC^vuZzqcLCr|-5&9دt<5J+94YJϼf!ޅ#S$O9/ n4o`:m3gZ% ,uiqQ8@,ժ6p('RD.0 9j93^{A (`e8WRi Di 4A -QF ^36L3TqVxƼ+SS %SUAX)ެk`F'Ӳ`4AvD-fN  UtpJxıc`A +@ofIsXQ]EDC^*ϱ`8J ;̞o$%]7=Ӿd95O/EVSϣRkg?=lxiu-RH ҒdJí8T=7zȾY ;OCw9F9;P߀Z21b'/OT3ˣv8)t,7g8~yx.;|Q"v%rvr>zjb^G /@_uY,|J>Ej}]m&m{*e湂J2RŬĖ^WmKkT ]!$ؒZW`y*`CE5,2NVD<@+e0pf UA@ CKB :4(t jj}wMsiŧ~]rN'j]wl΂3O:'Ms,sda'PdgZ 'Pl2:yoK&/Y /IgOtQsrFPxgoNe2M ,ȼuz[tv 52TI?sTPCkZQ~%OzC5lՄ}Cl~Q8 5 eMUȞ"'5ȠG1ӯ \=+x|nbwlR^`>Qo#qOvzM7?|* z8(I7lq oLb:„~~¿?-E0 Ele@p2 -B/fɋ.mM dRQj#P3!4j:R]d/7靈&ptwp\Cg#h5;>{:FF2"]z4+P{Hn23C)ySEcXf.VKKYń|aDB0#Ě\W$`cG>M4 zYbl9w4 =$%0zDqV8DI)Z.8|ȑ^JQά2!cu\y l #)>&`Q<5Ļ0۵iA2*2pqx *vƲEG[v"> Ef35555sbrrs?T8jatFYliCVJXV .@)e)U;wJl E֜ǑA<{o\f] X]cnIIAu#Me#^c;(/..?cXCWxVjj ۟ܡM (&̆kO΍QDDv żx8~jz6{xwy T28^Y0xDuA-hY4WJwx}ɂz2OOPmo91md%|*^"!JarP[ eBb * -7&m]y-RMxA؛SJ*ǔd DH+BP(,sgPIO8v? #D5L!~?d7ϽvC葘to2BmEU DyX_OF5 7$ ?e=ݬ.8BEM̎65'2ߊn5Ꞿ%t|Wo_2 Vvm<26 (/2Gfg=h?V࣮ eϨ{ jix{܁wt &_obFm6< F36jh0;8)DsB]:3Ne5òEP CH9i] byx<؅} ڐHU'A:=T]3-Skc P=%Lm/8 2ir_MsLԎw>*D6V* ќZ鮨q2b5tDshO'qPcj0$sDWspa=%ÙFJ}2!wۡ77mSl;m?Tֺ8ljЙ#,iqm@ \d>i@ rd8ȕZtH`TE@CF21"P|g:D[W!7 U P^w,uHX][$, CulSE.̙W6( )QQTcEear&!Ya6P`1"D { @5hcXc]8lưHJ1U[/qtG|\wօRIay 8%Rɼp!h]BgߴgR^we ã}1Xx|9'v.0hN{/wwpc4p-X`ڟodExJT=b2?Ktmn<뫋_%2k\n88K"$^,$î㫴\6CuVE-Ѷpw?k]O.t}]$=k!{K9xa1Ԉg_(jȡcOKozaOC9Uk0@ eՐj3%w/ Fl 8:*rD`F .x!iV%noȩ0(\荨+QlMHyވaL)"IahzP䜉ވs:!1ݲ:IEވ л}꺙@،STz~6B!!Y~mmF `o" Vo]Qcw]|m@9[U> UʸcAh-;U)skFfET- l9ޣU߫O߫JΆPZOx(pj(Bf` j4?0 ޯjQv5{Lq62!lJ 1FV}Sdb?ܬ|#b׊C>e?T{dFaӂS} ފ+ç3r=~Y~OdtzeZ̹zl(%M|6zlEMvH&qI|/`s3,1eʖ"Ġ e. 
Ý#%`ie( S79R֤pߞfHI|nP$9cJܕNܽP~2276,oPFq)~j]ӲRty ) 4Qo,eBC*VHe5O ١8?b77B9+ DHDI 4}\'7چ),i@͚44o됃WG}g@͊q=Q"_ΐЎcGlzK=@CcD9cv a})=6O/ۓez{és0ZIooFQDwG6;ɹiBF˄䦖x\ӹNuA?4e¬]|#SXA`I ?Vfן;)`5=:u N"n{Jn]%g҇Icк #];vPIXcPO;E{=S4|snbZ/SHզ]:qCTDTSh3nhzi*qj$TSXK5n5CGە۾_ZKmsU(J|w\yJ 9rj )5/)xrR[gqv [ǎS^ɍ/iA;Okm,%bgV..â*,0niUNTq9>N]B̏DRpo(9l)]#|7!VKgh7&"IZW?@=_tzCaKO/%H!u9o4 x7}zIڡ괦9 _5Whf/=G#!y^$ ֩n-j qO!MfRH ̑7bR 1mlPlpŇ@G&NMYs -tn W+\뒍,ir%j|Z1sW5mA8Uv2Q{E/t~Sv#"M˨r|q$l G{Z1vVvvp/,j5Zu -jH NNJbI~~Oܯ`  @:8UY |ފBc9 q~nsBh#4eu}ZC`A*ǠxH2(.Zu1k`#d:9ߋ9WFDžVo? :T-.8G2gz~lwSF.9쳞S{Z툓nJrUf)EgFB<(x +),>>Jc֒yx/˾!~?FƗy<( ufͺX]Wux0sнL{;atr̼ݳYBpEHG>.Ͻx tahHe{3⸓-@h6:re7ɻ8#&׵2O/w [?9)ӫ˓P3*gDtBk1)-X/P3U&/g˭;lKo ~Pwn#ǭq/\[6<k4 ˴Mjg\d )А ^1P3$ q[I%ThE^(rRصbv&GJFJH/IjhcйTG^ϓ7ꎲI6`-CCJ!pmQkϓrfF{Ւ,NBHi/zs$~u,7)j|}xѢe4 NKjC[.2s"ӅJz'z<4ֹ qK+TcHxXN .LeVAϐBd^GATJZ*nʂnJAJmٹGDp&.@`rN[]KZr{ö9[߮\MnMэ6~StjrK2z&DuM'&7b던h.VNa#pCK[ݟ;emt,?Gq[0J=Alο] 57l.ѥ"1)j:x]4.S’f `BLuO/ӻZF8W.C  &"PX4-7T@DS( hJ4Rm4z%fN4P E9JI-7"-A+do)+9iYEj#/ >XL%YZ0,fא@*`- W"y{u>G*ը4΃:E44=)zHf"G<뾓q>Y7ϑzJʴ23*nm T%.} M 1oHݎZ ?ѧgJ Nv+ɷI:56`6!IVP :b&q~b3FitY7/'ߜJ.5uM_99w'e4ohawh F}o*ފ>=n+9<=L(?\4V~Z%Z3j Muzo?k%('5>՘ԭxMK2p@i),k Z MKT+-Ghq9Z G-=d-$ܢهt>_5}үEKi ?'EM.5r`J#uC߶W8':1O+EͰ? Smc:q@E\ D dݱz1v 8i(Fǐj!DDd[KF2dFJL)+E:{b-ru >O~hy*93ײ5j-j)PKIV^K4{H,6{M7QZ䠶@_3to6 WSjj -rc]d 2g5;ByyfTF[r4Hַ˖!Z'@g? OZYgYJimY$EIICZ5jǔL:# %LoјވEޡu"ޗs!Ern C(UҖoC<Rx\5o@ZcӋY C@a2MC)ᨙеz {,*Io?le2. 
{T+a{ mAi)ViwXI 3\:l R0y2iNruƸ ^$]s̶|P&0+;OJB dbH1դuFȈA8#&[39-yd@ 3Bja}"ɈfͷS)"t+R,k^YEuA,pdK>*矋<&C.Y/Jo0(p(EՁ3e2̀KU'0z/PI!cm1PRrv''QG(ʒP;@,($E9oJ'6_/14,6b@ybq,Z9ud^Kyc@%GF tz\#:pFJaqv#2ݦ3j(AY:B/@9oS| ][f clY@i&^׊5F]DcNYuxZ#6`HQ#fQ#{q = f?~J#"2ic$˜0i޿bj: U@uw K+WmPD.%T/>WcTRC>DHKT=czǢf3}XxCj5ioF[N{,>=;}@wrPb'minifj.)M6%8ﯾ'd=D:~6(J@5N 蠅Ȟۺ/qS֖8zֵ(HJVe*ـ!ChBqōh$X ( }~قJm'k_-h3fkNm"Fi 5\@'-lNH=ǻ-2v ޑO/ hq`_{dln+p9mR)<십;e7a8M>ml@80T 8_vEn$XetV .婶ʻǤדu՛Ohz(& (ƚæMd|EZ$uQ"wum@}L?_0`(P }YaH"KvEЊRlɭ_8UOGg^2뱛S6|ڄV|CsUn 4+rw #JJa_ydi5jð!5faRu։Rj8Nu ̟KRRSJ J诡mP"9 rz28nSj*&tw籗zLS@x枷 [jбw\bnH@3원i3B}^(3!j>Ԓ"cA"F0bBݥe4<%ˊ>vx%23B3|nr,.3##E+v!9-06K2)yIsMΪ-D(ڢ (-"#&[|I ro:σbzG// 31C5;:j:j[_~}ws0}o`&\a.Z=Zh΢ qɢNsny"K|TFφtݩq Celb@%S+L?,;|Ƶi3P?P Aʮ0u[n zNp['}D[[ҹW?7e6~׺ꮚz9ˋ_gʹo'c<]/ *ej=ߡΞ}~]KSrsN_jen엪\=ѯY{\U!?8Dc0=RV7z-V!Љ6^븁[! bW76lRolP`[o\Q])1R_6NQ]&RRq)GRa(-@O4rǍR*PJEycJ_)5^icD) QzRj@2H>\z{J]͆#H[K+6g7mzUuFoݵcmf!-(]svcnW -wv'Vs>[ `T%4y=n߿<^_PƨyhXϾ)-R9a1/> j D,$x|hP9>x'㠔n)8Qgin'pqq0eC` H)M߷7-6Cd + i(%;&%nMq7d.KuX8FldNRdnQ128NOF\ o L)~b3"W0y4#;MMK_I 1]]Pcz9|>g37"aHeXVbTR1T(B1IV9#\K, 佛ϻ 䖉DW$DRzvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000003652047215155650544017720 0ustar rootrootMar 16 00:06:30 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 16 00:06:30 crc restorecon[4697]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c476,c820 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc 
restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc 
restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 
00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc 
restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc 
restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 16 00:06:30 crc restorecon[4697]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc 
restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:30 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 
crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc 
restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc 
restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 16 00:06:31 crc restorecon[4697]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 16 00:06:31 crc kubenswrapper[4983]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.825207 4983 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837232 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837275 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837284 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837294 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837303 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837312 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837321 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837329 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837336 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837344 4983 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837352 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837359 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837367 4983 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837375 4983 feature_gate.go:330] unrecognized feature gate: Example Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837385 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837397 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837405 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837413 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837421 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837428 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837438 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837445 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837453 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837460 4983 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837468 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837476 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837483 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837491 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837498 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837509 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837518 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837527 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837535 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837554 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837562 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837571 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837579 4983 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837587 4983 feature_gate.go:330] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837596 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837604 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837611 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837619 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837627 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837634 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837642 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837649 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837657 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837667 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837676 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837683 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837691 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837699 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837709 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837718 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837727 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837735 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837742 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837750 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837792 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837802 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837812 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837822 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 
00:06:31.837832 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837841 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837849 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837857 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837864 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837872 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837880 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837890 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.837903 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838118 4983 flags.go:64] FLAG: --address="0.0.0.0" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838143 4983 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838169 4983 flags.go:64] FLAG: --anonymous-auth="true" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838181 4983 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838192 4983 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838201 4983 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838212 4983 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838223 4983 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838232 4983 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838241 4983 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838251 4983 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838282 4983 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838293 4983 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838302 4983 flags.go:64] FLAG: --cgroup-root="" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838311 4983 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838321 4983 flags.go:64] FLAG: --client-ca-file="" Mar 16 00:06:31 crc kubenswrapper[4983]: 
I0316 00:06:31.838330 4983 flags.go:64] FLAG: --cloud-config="" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838340 4983 flags.go:64] FLAG: --cloud-provider="" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838349 4983 flags.go:64] FLAG: --cluster-dns="[]" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838374 4983 flags.go:64] FLAG: --cluster-domain="" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838383 4983 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838392 4983 flags.go:64] FLAG: --config-dir="" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838401 4983 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838411 4983 flags.go:64] FLAG: --container-log-max-files="5" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838430 4983 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838439 4983 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838448 4983 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838459 4983 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838468 4983 flags.go:64] FLAG: --contention-profiling="false" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838477 4983 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838486 4983 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838497 4983 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838520 4983 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 16 00:06:31 crc 
kubenswrapper[4983]: I0316 00:06:31.838531 4983 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838541 4983 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838550 4983 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838559 4983 flags.go:64] FLAG: --enable-load-reader="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838568 4983 flags.go:64] FLAG: --enable-server="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838578 4983 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838596 4983 flags.go:64] FLAG: --event-burst="100"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838606 4983 flags.go:64] FLAG: --event-qps="50"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838615 4983 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838624 4983 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838634 4983 flags.go:64] FLAG: --eviction-hard=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838645 4983 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838654 4983 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838663 4983 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838683 4983 flags.go:64] FLAG: --eviction-soft=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838693 4983 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838702 4983 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838711 4983 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838719 4983 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838730 4983 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838742 4983 flags.go:64] FLAG: --fail-swap-on="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838791 4983 flags.go:64] FLAG: --feature-gates=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838808 4983 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838820 4983 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838829 4983 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838838 4983 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838848 4983 flags.go:64] FLAG: --healthz-port="10248"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838857 4983 flags.go:64] FLAG: --help="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838866 4983 flags.go:64] FLAG: --hostname-override=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838875 4983 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838884 4983 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838898 4983 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838906 4983 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838915 4983 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838924 4983 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838932 4983 flags.go:64] FLAG: --image-service-endpoint=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838941 4983 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838949 4983 flags.go:64] FLAG: --kube-api-burst="100"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838960 4983 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838971 4983 flags.go:64] FLAG: --kube-api-qps="50"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838982 4983 flags.go:64] FLAG: --kube-reserved=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.838994 4983 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839004 4983 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839015 4983 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839024 4983 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839032 4983 flags.go:64] FLAG: --lock-file=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839041 4983 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839050 4983 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839059 4983 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839073 4983 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839097 4983 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839106 4983 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839116 4983 flags.go:64] FLAG: --logging-format="text"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839124 4983 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839134 4983 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839142 4983 flags.go:64] FLAG: --manifest-url=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839151 4983 flags.go:64] FLAG: --manifest-url-header=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839163 4983 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839172 4983 flags.go:64] FLAG: --max-open-files="1000000"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839182 4983 flags.go:64] FLAG: --max-pods="110"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839191 4983 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839201 4983 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839210 4983 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839222 4983 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839231 4983 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839241 4983 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839249 4983 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839269 4983 flags.go:64] FLAG: --node-status-max-images="50"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839278 4983 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839286 4983 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839296 4983 flags.go:64] FLAG: --pod-cidr=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839305 4983 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839317 4983 flags.go:64] FLAG: --pod-manifest-path=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839326 4983 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839335 4983 flags.go:64] FLAG: --pods-per-core="0"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839344 4983 flags.go:64] FLAG: --port="10250"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839353 4983 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839362 4983 flags.go:64] FLAG: --provider-id=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839371 4983 flags.go:64] FLAG: --qos-reserved=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839380 4983 flags.go:64] FLAG: --read-only-port="10255"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839389 4983 flags.go:64] FLAG: --register-node="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839398 4983 flags.go:64] FLAG: --register-schedulable="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839407 4983 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839421 4983 flags.go:64] FLAG: --registry-burst="10"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839430 4983 flags.go:64] FLAG: --registry-qps="5"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839439 4983 flags.go:64] FLAG: --reserved-cpus=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839460 4983 flags.go:64] FLAG: --reserved-memory=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839471 4983 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839482 4983 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839491 4983 flags.go:64] FLAG: --rotate-certificates="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839500 4983 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839509 4983 flags.go:64] FLAG: --runonce="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839518 4983 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839528 4983 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839537 4983 flags.go:64] FLAG: --seccomp-default="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839550 4983 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839559 4983 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839569 4983 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839578 4983 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839588 4983 flags.go:64] FLAG: --storage-driver-password="root"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839596 4983 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839605 4983 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839613 4983 flags.go:64] FLAG: --storage-driver-user="root"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839623 4983 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839631 4983 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839640 4983 flags.go:64] FLAG: --system-cgroups=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839649 4983 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839662 4983 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839672 4983 flags.go:64] FLAG: --tls-cert-file=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839681 4983 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839697 4983 flags.go:64] FLAG: --tls-min-version=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839706 4983 flags.go:64] FLAG: --tls-private-key-file=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839714 4983 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839723 4983 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839733 4983 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839742 4983 flags.go:64] FLAG: --v="2"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839780 4983 flags.go:64] FLAG: --version="false"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839792 4983 flags.go:64] FLAG: --vmodule=""
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839803 4983 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.839813 4983 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840051 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840064 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840084 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840092 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840101 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840115 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840123 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840133 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840141 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840149 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840157 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840165 4983 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840172 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840180 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840188 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840196 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840204 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840212 4983 feature_gate.go:330] unrecognized feature gate: Example
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840220 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840227 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840238 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840248 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840257 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840266 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840274 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840282 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840290 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840298 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840305 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840313 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840321 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840328 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840336 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840343 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840352 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840360 4983 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840367 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840377 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840396 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840407 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840415 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840423 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840431 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840439 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840446 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840456 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840465 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840473 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840481 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840490 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840497 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840505 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840513 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840523 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840533 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840542 4983 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840552 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840560 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840569 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840577 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840585 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840593 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840602 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840610 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840617 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840625 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840633 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840640 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840648 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840661 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.840670 4983 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.840697 4983 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.854562 4983 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.854620 4983 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854747 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854806 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854821 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854838 4983 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854852 4983 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854863 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854874 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854884 4983 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854893 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854903 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854915 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854926 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854938 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854947 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854959 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854969 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854979 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854988 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.854996 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855006 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855015 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855023 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855032 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855042 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855051 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855059 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855068 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855078 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855087 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855099 4983 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855108 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855117 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855127 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855136 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855145 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855154 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855162 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855171 4983 feature_gate.go:330] unrecognized feature gate: Example
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855179 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855187 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855196 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855204 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855212 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855220 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855228 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855237 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855246 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855254 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855263 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855271 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855280 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855288 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855297 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855305 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855314 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855322 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855330 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855342 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855352 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855365 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855376 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855387 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855397 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855406 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855415 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855424 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855433 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855442 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855452 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855464 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855473 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.855488 4983 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855746 4983 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855802 4983 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855814 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855824 4983 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855833 4983 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855841 4983 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855852 4983 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855861 4983 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855870 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855879 4983 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855888 4983 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855896 4983 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855905 4983 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855913 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855922 4983 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855930 4983 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855938 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855946 4983 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855955 4983 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855967 4983 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855978 4983 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855988 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.855998 4983 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856008 4983 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856018 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856027 4983 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856036 4983 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856044 4983 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856052 4983 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856062 4983 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856072 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856080 4983 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856088 4983 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856096 4983 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 16 00:06:31 crc 
kubenswrapper[4983]: W0316 00:06:31.856104 4983 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856113 4983 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856121 4983 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856129 4983 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856140 4983 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856151 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856160 4983 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856170 4983 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856179 4983 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856187 4983 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856195 4983 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856203 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856212 4983 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856220 4983 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 16 00:06:31 crc 
kubenswrapper[4983]: W0316 00:06:31.856228 4983 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856237 4983 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856246 4983 feature_gate.go:330] unrecognized feature gate: Example Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856254 4983 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856264 4983 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856272 4983 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856280 4983 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856288 4983 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856298 4983 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856310 4983 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856324 4983 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856334 4983 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856344 4983 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856355 4983 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856363 4983 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856372 4983 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856380 4983 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856390 4983 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856399 4983 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856408 4983 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856416 4983 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856424 4983 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 16 00:06:31 crc kubenswrapper[4983]: W0316 00:06:31.856432 4983 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.856447 4983 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.856731 4983 server.go:940] "Client rotation is on, will bootstrap in background" Mar 16 00:06:31 crc kubenswrapper[4983]: E0316 00:06:31.862527 4983 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.868372 4983 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.868538 4983 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.870616 4983 server.go:997] "Starting client certificate rotation" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.870681 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.870930 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.900460 4983 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:06:31 crc kubenswrapper[4983]: E0316 00:06:31.904501 4983 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: 
connection refused" logger="UnhandledError" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.904560 4983 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.921628 4983 log.go:25] "Validated CRI v1 runtime API" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.955170 4983 log.go:25] "Validated CRI v1 image API" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.960084 4983 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.964823 4983 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-16-00-00-46-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.964870 4983 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.991220 4983 manager.go:217] Machine: {Timestamp:2026-03-16 00:06:31.987917689 +0000 UTC m=+0.588016189 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec 
SystemUUID:2ead470a-f520-44aa-9efc-f4170c7efbf2 BootID:07bf7a14-97e0-4c5e-b357-db0b2f7bca2e Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:eb:47:27 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:eb:47:27 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:76:fd:9d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:dd:62:62 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:54:44:df Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:7a:50:f1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:72:9a:cb:14:3c:2c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:ce:d3:45:61:cc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 
Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] 
SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.991592 4983 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.991818 4983 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.994608 4983 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.994948 4983 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.995007 4983 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.995303 4983 topology_manager.go:138] "Creating topology manager with none policy" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.995320 4983 container_manager_linux.go:303] "Creating device plugin manager" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.996514 4983 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.996562 4983 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.996907 4983 state_mem.go:36] "Initialized new in-memory state store" Mar 16 00:06:31 crc kubenswrapper[4983]: I0316 00:06:31.997043 4983 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.001923 4983 kubelet.go:418] "Attempting to sync node with API server" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.001968 4983 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.002007 4983 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.002027 4983 kubelet.go:324] "Adding apiserver pod source" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.002045 4983 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.005809 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.005915 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.005846 4983 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.006254 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.007002 4983 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.008130 4983 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.012562 4983 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014076 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014157 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014211 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014274 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014328 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014385 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014433 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014484 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014533 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014581 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014636 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.014694 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.016930 4983 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.017996 4983 server.go:1280] "Started kubelet"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.022907 4983 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.022599 4983 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 16 00:06:32 crc systemd[1]: Started Kubernetes Kubelet.
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.024406 4983 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.025521 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.028055 4983 server.go:460] "Adding debug handlers to kubelet server"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.033516 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.033592 4983 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034066 4983 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034089 4983 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034142 4983 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.034553 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.034622 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.034931 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.034964 4983 factory.go:55] Registering systemd factory
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035286 4983 factory.go:221] Registration of the systemd container factory successfully
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035842 4983 factory.go:153] Registering CRI-O factory
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035880 4983 factory.go:221] Registration of the crio container factory successfully
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035960 4983 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.035997 4983 factory.go:103] Registering Raw factory
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.036016 4983 manager.go:1196] Started watching for new ooms in manager
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.036828 4983 manager.go:319] Starting recovery of all containers
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.034044 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.036870 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041866 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041931 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041952 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041972 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.041984 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042002 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042016 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042029 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042044 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042057 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042074 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042086 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042100 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042115 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042210 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042231 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042248 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042262 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042275 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.042316 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.045708 4983 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.045907 4983
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046043 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046127 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046211 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046294 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046385 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046514 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046640 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046749 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046874 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.046958 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047043 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047137 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047219 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047440 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047559 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047652 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047732 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047843 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.047937 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048017 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048322 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048448 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048608 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048713 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048826 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.048928 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049011 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049092 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049200 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049296 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049401 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049500 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049605 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049711 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049843 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.049935 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" 
seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050025 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050109 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050188 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050275 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050379 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050487 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050583 4983 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050688 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050808 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.050905 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051001 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051083 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051168 4983 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051250 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051350 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051455 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051542 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051629 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051710 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051851 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.051951 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052096 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052182 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052262 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052348 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052428 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052510 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052590 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052672 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052805 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.052905 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053041 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053160 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053249 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053328 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053426 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053521 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053654 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053803 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053890 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.053999 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054288 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054388 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054478 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.054561 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.055286 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056456 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056578 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056605 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056639 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056669 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056688 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056718 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056745 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056788 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056806 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056825 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056844 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.056860 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057362 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057388 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057416 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057435 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057462 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057487 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057512 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057542 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057564 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057587 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057614 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057636 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057668 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057692 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057711 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057732 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057769 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057793 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057813 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057832 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057861 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057887 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057914 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057940 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057966 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.057993 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058013 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058038 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058056 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058082 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058117 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058136 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058156 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058176 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058193 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058214 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058230 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058250 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058274 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058290 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058308 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058325 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058338 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058356 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058368 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058390 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058406 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058426 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058489 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058506 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058529 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058547 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058561 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058588 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058601 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058615 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058632 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058646 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058667 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058684 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058698 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058714 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058729 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058765 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058784 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058798 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058814 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058826 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058843 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058856 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058868 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058885 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058897 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058919 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058937 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058953 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058970 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.058982 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059006 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059018 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059033 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059050 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059062 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059078 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059092 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059105 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059121 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059139 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059150 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059166 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059180 4983 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059195 4983 reconstruct.go:97] "Volume reconstruction finished"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.059204 4983 reconciler.go:26] "Reconciler: start to sync state"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.064710 4983 manager.go:324] Recovery completed
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.079087 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.080835 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.080883 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.080895 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.082339 4983 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.082657 4983 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.082783 4983 state_mem.go:36] "Initialized new in-memory state store"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.089629 4983 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.091253 4983 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.091304 4983 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.091338 4983 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.091525 4983 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.093879 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.093998 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.101465 4983 policy_none.go:49] "None policy: Start"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.102645 4983 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.102701 4983 state_mem.go:35] "Initializing new in-memory state store"
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.135534 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.156998 4983 manager.go:334] "Starting Device Plugin manager"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.157114 4983 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.157138 4983 server.go:79] "Starting device plugin registration server"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158007 4983 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158040 4983 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158328 4983 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158469 4983 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.158492 4983 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.167128 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.192519 4983 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.192701 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194695 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194766 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.194962 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.195133 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.195186 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.195996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196035 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196044 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196133 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196545 4983 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.196569 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197007 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197181 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197208 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197330 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197359 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197391 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197440 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197471 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.197922 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198084 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198570 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198588 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198505 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198831 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198943 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.198971 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200623 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200632 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.200998 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201051 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201063 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201343 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.201405 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.202447 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.202472 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.202481 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.239877 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.258816 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260153 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260179 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.260235 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.261086 4983 kubelet_node_status.go:99] 
"Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261216 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261316 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261374 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261413 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261453 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261487 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261586 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261682 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261729 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261791 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261834 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261863 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261882 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261926 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.261954 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc 
kubenswrapper[4983]: I0316 00:06:32.363289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363414 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363478 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363520 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363535 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363561 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363645 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363660 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363691 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363733 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.363550 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363737 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363867 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363881 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 
16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363896 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363956 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363978 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363936 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364053 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363941 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363932 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363896 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.363864 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364271 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364310 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 
crc kubenswrapper[4983]: I0316 00:06:32.364551 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364618 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364729 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364830 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.364910 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.461623 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 
00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464081 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464098 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.464139 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.464815 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.518302 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.525343 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.545185 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.569221 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564 WatchSource:0}: Error finding container 38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564: Status 404 returned error can't find the container with id 38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564 Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.569516 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.572959 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9 WatchSource:0}: Error finding container 892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9: Status 404 returned error can't find the container with id 892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9 Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.579923 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.585243 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009 WatchSource:0}: Error finding container 8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009: Status 404 returned error can't find the container with id 8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009 Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.593231 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129 WatchSource:0}: Error finding container e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129: Status 404 returned error can't find the container with id e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129 Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.613942 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf WatchSource:0}: Error finding container 266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf: Status 404 returned error can't find the container with id 266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.641902 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" 
interval="800ms" Mar 16 00:06:32 crc kubenswrapper[4983]: W0316 00:06:32.843361 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.843504 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.865813 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867513 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867596 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867636 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:32 crc kubenswrapper[4983]: I0316 00:06:32.867689 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:32 crc kubenswrapper[4983]: E0316 00:06:32.868523 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.027187 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.095735 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"38b9c4b82dacfca28c66a23c0c39b80afdf00c14c9cdc7cae09e7662b0f01564"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.096806 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"266776467af6a2a96a278c6ceb97290c41905a1999a499481d0b3226b5671daf"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.097986 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e5895ae1ce3134ff6b7749cde7865a85f99287ea2067518bf4fe851c7db5b129"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.099063 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cad439d2f13373ddf0a9f9dafc0ab855b098fa917e6a8b1d9bd3fd177c03009"} Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.101206 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"892b62111794c7ab545a80c0afb380ed7cc2f821a9903a1312a809a47a88e8d9"} Mar 16 00:06:33 crc kubenswrapper[4983]: W0316 00:06:33.323103 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.323516 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:33 crc kubenswrapper[4983]: W0316 00:06:33.413051 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.413141 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:33 crc kubenswrapper[4983]: W0316 00:06:33.424386 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.424598 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.223:6443: 
connect: connection refused" logger="UnhandledError" Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.442921 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.669486 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672519 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672587 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672602 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:33 crc kubenswrapper[4983]: I0316 00:06:33.672651 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:33 crc kubenswrapper[4983]: E0316 00:06:33.673337 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.026845 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.089743 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:34 crc kubenswrapper[4983]: E0316 00:06:34.091243 
4983 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.223:6443: connect: connection refused" logger="UnhandledError" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.105321 4983 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173" exitCode=0 Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.105463 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.105487 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.106470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.106495 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.106506 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108718 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 
00:06:34.108746 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108779 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108791 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.108803 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.110888 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.110944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.110957 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.111608 4983 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce" exitCode=0 Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.111678 4983 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.111737 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112870 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112909 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112923 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112931 4983 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10" exitCode=0 Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.112985 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.113247 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114111 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719" exitCode=0 Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114130 4983 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719"} Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114162 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114183 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114203 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114942 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.114975 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.116596 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.118062 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.118104 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:34 crc kubenswrapper[4983]: I0316 00:06:34.118116 4983 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.026482 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.223:6443: connect: connection refused Mar 16 00:06:35 crc kubenswrapper[4983]: E0316 00:06:35.044614 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124194 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124250 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124264 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.124276 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.126120 4983 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230" exitCode=0 Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.126209 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.126392 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.127334 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.127373 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.127386 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.130089 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.130130 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.131117 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.131145 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.131156 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.138833 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.138905 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.138789 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139343 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139382 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712"} Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139907 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.139920 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.140464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.140540 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.140556 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.274260 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276309 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276374 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276388 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.276424 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:35 crc kubenswrapper[4983]: E0316 00:06:35.277153 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.223:6443: connect: connection refused" node="crc" Mar 16 00:06:35 crc kubenswrapper[4983]: I0316 00:06:35.906821 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.151654 4983 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236" exitCode=0 Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.151836 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236"} Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.152097 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.153528 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.153588 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.153607 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159586 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed"} Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159623 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159720 4983 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159809 4983 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159729 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.159828 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.162151 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.162207 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.162233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.163845 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.163896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.163916 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.164874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.164913 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.164931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc 
kubenswrapper[4983]: I0316 00:06:36.165169 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.165231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.165252 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:36 crc kubenswrapper[4983]: I0316 00:06:36.300113 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.014958 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170191 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180"} Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170261 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31"} Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170281 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8"} Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170312 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.170389 4983 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172079 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172128 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172175 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172127 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172245 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.172287 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.312594 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.312871 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.314474 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.314549 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:37 crc kubenswrapper[4983]: I0316 00:06:37.314575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.170694 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179248 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961"}
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179384 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32"}
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179318 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.179305 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181171 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181214 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181384 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181481 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.181508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.345991 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.477920 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480140 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480211 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480236 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.480280 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:06:38 crc kubenswrapper[4983]: I0316 00:06:38.676651 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.181677 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.181814 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.182857 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.182908 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.182925 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.183247 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.183268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:39 crc kubenswrapper[4983]: I0316 00:06:39.183278 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.184442 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.185382 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.185458 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.185477 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.786052 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.786258 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.787521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.787600 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:40 crc kubenswrapper[4983]: I0316 00:06:40.787622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:42 crc kubenswrapper[4983]: E0316 00:06:42.167273 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.357323 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.357850 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.360103 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.360166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:42 crc kubenswrapper[4983]: I0316 00:06:42.360188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.708099 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.708344 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.710115 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.710168 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.710187 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:43 crc kubenswrapper[4983]: I0316 00:06:43.715750 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.195322 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.196902 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.196955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.196970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:44 crc kubenswrapper[4983]: I0316 00:06:44.200849 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.197812 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.199058 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.199118 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.199138 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.358257 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.358331 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 16 00:06:45 crc kubenswrapper[4983]: W0316 00:06:45.561398 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.561504 4983 trace.go:236] Trace[878499469]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:35.559) (total time: 10001ms):
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[878499469]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:45.561)
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[878499469]: [10.001558097s] [10.001558097s] END
Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.561530 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 16 00:06:45 crc kubenswrapper[4983]: W0316 00:06:45.580186 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.580326 4983 trace.go:236] Trace[577567899]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:35.578) (total time: 10001ms):
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[577567899]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:45.580)
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[577567899]: [10.001678111s] [10.001678111s] END
Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.580362 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 16 00:06:45 crc kubenswrapper[4983]: W0316 00:06:45.948169 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.948287 4983 trace.go:236] Trace[254954146]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (16-Mar-2026 00:06:35.946) (total time: 10001ms):
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[254954146]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:06:45.948)
Mar 16 00:06:45 crc kubenswrapper[4983]: Trace[254954146]: [10.001825475s] [10.001825475s] END
Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.948316 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 16 00:06:45 crc kubenswrapper[4983]: E0316 00:06:45.992016 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:45Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.996444 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:45Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.998059 4983 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 16 00:06:45 crc kubenswrapper[4983]: I0316 00:06:45.998136 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 16 00:06:46 crc kubenswrapper[4983]: E0316 00:06:46.000981 4983 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 16 00:06:46 crc kubenswrapper[4983]: W0316 00:06:46.003336 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:46 crc kubenswrapper[4983]: E0316 00:06:46.003424 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.004132 4983 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.004180 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 16 00:06:46 crc kubenswrapper[4983]: E0316 00:06:46.009106 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 16 00:06:46 crc kubenswrapper[4983]: E0316 00:06:46.011211 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.029867 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:46Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.202129 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.203407 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed" exitCode=255
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.203454 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed"}
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.203618 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204398 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204425 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204433 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.204959 4983 scope.go:117] "RemoveContainer" containerID="6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed"
Mar 16 00:06:46 crc kubenswrapper[4983]: I0316 00:06:46.906854 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.031289 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:47Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.049737 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.049968 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.051619 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.051673 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.051689 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.109943 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.208098 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.209431 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.209856 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"}
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.210029 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.211110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.211328 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.211347 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.213231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.213273 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.213288 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:47 crc kubenswrapper[4983]: I0316 00:06:47.226253 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.031610 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:48Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.214132 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.215571 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.216727 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" exitCode=255
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.216913 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.217337 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"}
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.217378 4983 scope.go:117] "RemoveContainer" containerID="6dda0f9e5f13f6251926769a0d785e383946b7af1a7bab692e1c018a88e171ed"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.217892 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.218005 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.218110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.218563 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.219362 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.219398 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.219412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:48 crc kubenswrapper[4983]: I0316 00:06:48.220029 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"
Mar 16 00:06:48 crc kubenswrapper[4983]: E0316 00:06:48.220223 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.032174 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.222135 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.224750 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.225944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.226176 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.226363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:49 crc kubenswrapper[4983]: I0316 00:06:49.227530 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"
Mar 16 00:06:49 crc kubenswrapper[4983]: E0316 00:06:49.228028 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:06:49 crc kubenswrapper[4983]: W0316 00:06:49.569890 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:49 crc kubenswrapper[4983]: E0316 00:06:49.570006 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 16 00:06:49 crc kubenswrapper[4983]: W0316 00:06:49.904328 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:49 crc kubenswrapper[4983]: E0316 00:06:49.904427 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.031604 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:50Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.790603 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.790790 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792087 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792140 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.792848 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"
Mar 16 00:06:50 crc kubenswrapper[4983]: E0316 00:06:50.793027 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:06:50 crc kubenswrapper[4983]: I0316 00:06:50.799986 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.031559 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:51Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.229614 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.230691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.230721 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.230731 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:51 crc kubenswrapper[4983]: I0316 00:06:51.231250 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0"
Mar 16 00:06:51 crc kubenswrapper[4983]: E0316 00:06:51.231404 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:06:51 crc kubenswrapper[4983]: W0316 00:06:51.691316 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:51Z is after 2026-02-23T05:33:13Z
Mar 16 00:06:51 crc kubenswrapper[4983]: E0316 00:06:51.691426 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:06:51Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.033578 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:06:52 crc kubenswrapper[4983]: W0316 00:06:52.078042 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.078129 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.167437 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.409401 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410747 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:06:52 crc kubenswrapper[4983]: I0316 00:06:52.410796 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.415537 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User 
\"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:06:52 crc kubenswrapper[4983]: E0316 00:06:52.415884 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:06:53 crc kubenswrapper[4983]: I0316 00:06:53.035084 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:54 crc kubenswrapper[4983]: I0316 00:06:54.030438 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:54 crc kubenswrapper[4983]: I0316 00:06:54.363669 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 16 00:06:54 crc kubenswrapper[4983]: I0316 00:06:54.386516 4983 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 16 00:06:55 crc kubenswrapper[4983]: I0316 00:06:55.033655 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:55 crc kubenswrapper[4983]: I0316 00:06:55.359515 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:06:55 crc kubenswrapper[4983]: I0316 00:06:55.359630 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.003575 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f34e6fcf6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,LastTimestamp:2026-03-16 00:06:32.017935606 +0000 UTC m=+0.618034086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.010813 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.017691 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.024981 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC 
m=+0.681001471,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.031597 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.032081 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f3d6059ef default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.160106991 +0000 UTC m=+0.760205441,LastTimestamp:2026-03-16 00:06:32.160106991 +0000 UTC m=+0.760205441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.036173 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.194719643 +0000 UTC m=+0.794818073,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.039842 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.194742375 +0000 UTC m=+0.794840805,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.045947 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 
00:06:32.194772117 +0000 UTC m=+0.794870547,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.052358 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.196029329 +0000 UTC m=+0.796127759,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.058960 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.19604111 +0000 UTC m=+0.796139540,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.066968 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.196049971 +0000 UTC m=+0.796148401,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.074292 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.197179585 +0000 UTC m=+0.797278065,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.077544 4983 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.197201526 +0000 UTC m=+0.797299956,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.081051 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.197215647 +0000 UTC m=+0.797314077,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.087540 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.197246259 +0000 UTC m=+0.797344689,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.089438 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.197360637 +0000 UTC m=+0.797459087,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.096361 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.197384388 +0000 UTC m=+0.797482838,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.104314 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.197400279 +0000 UTC m=+0.797498719,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.111310 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is 
now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.198047742 +0000 UTC m=+0.798146212,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.118520 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.198220053 +0000 UTC m=+0.798318523,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.125340 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC 
m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.198437537 +0000 UTC m=+0.798536017,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.132008 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.198495981 +0000 UTC m=+0.798594451,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.138506 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a73880\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a73880 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080865408 +0000 UTC m=+0.680963848,LastTimestamp:2026-03-16 00:06:32.198519163 +0000 UTC m=+0.798617603,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.144943 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a79c78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a79c78 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080891 +0000 UTC m=+0.680989440,LastTimestamp:2026-03-16 00:06:32.198581967 +0000 UTC m=+0.798680407,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.151620 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189d299f38a7cb6d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189d299f38a7cb6d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.080903021 +0000 UTC m=+0.681001471,LastTimestamp:2026-03-16 00:06:32.198596568 +0000 UTC m=+0.798695008,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.160105 4983 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299f561dbc08 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.575171592 +0000 UTC m=+1.175270052,LastTimestamp:2026-03-16 00:06:32.575171592 +0000 UTC m=+1.175270052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.167085 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299f56606be5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.579541989 +0000 UTC m=+1.179640459,LastTimestamp:2026-03-16 00:06:32.579541989 +0000 UTC m=+1.179640459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.173607 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f571baf0f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.591814415 +0000 UTC m=+1.191912885,LastTimestamp:2026-03-16 00:06:32.591814415 +0000 UTC m=+1.191912885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.180851 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299f5768971c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.596854556 +0000 UTC m=+1.196953016,LastTimestamp:2026-03-16 00:06:32.596854556 +0000 UTC m=+1.196953016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.185006 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299f5945840c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:32.628110348 +0000 UTC m=+1.228208818,LastTimestamp:2026-03-16 00:06:32.628110348 +0000 UTC m=+1.228208818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.187995 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7ad72767 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.191303015 +0000 UTC m=+1.791401465,LastTimestamp:2026-03-16 00:06:33.191303015 +0000 UTC m=+1.791401465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.192797 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299f7af1a144 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.193038148 +0000 UTC m=+1.793136588,LastTimestamp:2026-03-16 00:06:33.193038148 +0000 UTC m=+1.793136588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.195479 4983 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299f7b5ea7f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.200183288 +0000 UTC m=+1.800281718,LastTimestamp:2026-03-16 00:06:33.200183288 +0000 UTC m=+1.800281718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.200108 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7b6ee117 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.201246487 +0000 UTC m=+1.801344917,LastTimestamp:2026-03-16 00:06:33.201246487 +0000 UTC m=+1.801344917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc 
kubenswrapper[4983]: E0316 00:06:56.202395 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299f7b7c8c08 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.202142216 +0000 UTC m=+1.802240656,LastTimestamp:2026-03-16 00:06:33.202142216 +0000 UTC m=+1.802240656,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.209178 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7b92600e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.20357275 +0000 UTC 
m=+1.803671180,LastTimestamp:2026-03-16 00:06:33.20357275 +0000 UTC m=+1.803671180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.216512 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299f7b9ce56f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.204262255 +0000 UTC m=+1.804360685,LastTimestamp:2026-03-16 00:06:33.204262255 +0000 UTC m=+1.804360685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.223077 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299f7c503e91 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container 
wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.216016017 +0000 UTC m=+1.816114457,LastTimestamp:2026-03-16 00:06:33.216016017 +0000 UTC m=+1.816114457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.229673 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299f7cf60644 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.22688058 +0000 UTC m=+1.826979010,LastTimestamp:2026-03-16 00:06:33.22688058 +0000 UTC m=+1.826979010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.236498 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299f7d1db429 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.229481001 +0000 UTC m=+1.829579441,LastTimestamp:2026-03-16 00:06:33.229481001 +0000 UTC m=+1.829579441,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.245060 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299f7d24a955 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.229936981 +0000 UTC m=+1.830035411,LastTimestamp:2026-03-16 00:06:33.229936981 +0000 UTC m=+1.830035411,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.252827 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8d2fae19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.499094553 +0000 UTC m=+2.099193013,LastTimestamp:2026-03-16 00:06:33.499094553 +0000 UTC m=+2.099193013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.259852 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8e097a06 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.51336807 +0000 UTC m=+2.113466530,LastTimestamp:2026-03-16 00:06:33.51336807 +0000 UTC m=+2.113466530,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.266038 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8e338e37 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.516125751 +0000 UTC m=+2.116224221,LastTimestamp:2026-03-16 00:06:33.516125751 +0000 UTC m=+2.116224221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.272981 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f9ab64ff6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.726021622 +0000 UTC m=+2.326120052,LastTimestamp:2026-03-16 00:06:33.726021622 +0000 UTC m=+2.326120052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.280804 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f9b2428c0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.733220544 +0000 UTC m=+2.333318974,LastTimestamp:2026-03-16 00:06:33.733220544 +0000 UTC m=+2.333318974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.288200 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f9b41e782 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.735169922 +0000 UTC m=+2.335268352,LastTimestamp:2026-03-16 00:06:33.735169922 +0000 UTC m=+2.335268352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.294994 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299fa7a2c5eb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.942844907 +0000 UTC m=+2.542943327,LastTimestamp:2026-03-16 00:06:33.942844907 +0000 UTC m=+2.542943327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.300457 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.300710 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:56 crc 
kubenswrapper[4983]: E0316 00:06:56.300795 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299fa838d249 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.952678473 +0000 UTC m=+2.552776913,LastTimestamp:2026-03-16 00:06:33.952678473 +0000 UTC m=+2.552776913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.306078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.306153 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.306177 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.307845 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.308136 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.313044 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fb177eb9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.107808666 +0000 UTC m=+2.707907096,LastTimestamp:2026-03-16 00:06:34.107808666 +0000 UTC m=+2.707907096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.320844 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299fb1e622e5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.115031781 +0000 UTC m=+2.715130241,LastTimestamp:2026-03-16 00:06:34.115031781 +0000 UTC m=+2.715130241,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.325500 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fb1fadfb5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.116390837 +0000 UTC m=+2.716489277,LastTimestamp:2026-03-16 00:06:34.116390837 +0000 UTC m=+2.716489277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.330414 4983 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fb1fb26b9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.116409017 +0000 UTC m=+2.716507457,LastTimestamp:2026-03-16 00:06:34.116409017 +0000 UTC m=+2.716507457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.336034 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299fc04ac634 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.356508212 +0000 UTC m=+2.956606642,LastTimestamp:2026-03-16 00:06:34.356508212 +0000 UTC 
m=+2.956606642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.341508 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fc0949d12 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.361347346 +0000 UTC m=+2.961445776,LastTimestamp:2026-03-16 00:06:34.361347346 +0000 UTC m=+2.961445776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.347332 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fc0c53272 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.364531314 +0000 UTC m=+2.964629744,LastTimestamp:2026-03-16 
00:06:34.364531314 +0000 UTC m=+2.964629744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.353234 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fc0efb564 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.367317348 +0000 UTC m=+2.967415778,LastTimestamp:2026-03-16 00:06:34.367317348 +0000 UTC m=+2.967415778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.358390 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189d299fc19885e5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container 
kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.378380773 +0000 UTC m=+2.978479203,LastTimestamp:2026-03-16 00:06:34.378380773 +0000 UTC m=+2.978479203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.364885 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fc1eda1f9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.383958521 +0000 UTC m=+2.984056951,LastTimestamp:2026-03-16 00:06:34.383958521 +0000 UTC m=+2.984056951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.369661 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fc2003c35 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.385177653 +0000 UTC m=+2.985276083,LastTimestamp:2026-03-16 00:06:34.385177653 +0000 UTC m=+2.985276083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.373903 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fc2a427da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.395920346 +0000 UTC m=+2.996018776,LastTimestamp:2026-03-16 00:06:34.395920346 +0000 UTC m=+2.996018776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.378584 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fc2c091bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.397782459 +0000 UTC m=+2.997880889,LastTimestamp:2026-03-16 00:06:34.397782459 +0000 UTC m=+2.997880889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.384477 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fc335156c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.405418348 +0000 UTC m=+3.005516788,LastTimestamp:2026-03-16 00:06:34.405418348 +0000 UTC m=+3.005516788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 
00:06:56.390077 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fce6b26ca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.593511114 +0000 UTC m=+3.193609544,LastTimestamp:2026-03-16 00:06:34.593511114 +0000 UTC m=+3.193609544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.395653 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fce71a0bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.593935548 +0000 UTC m=+3.194033988,LastTimestamp:2026-03-16 00:06:34.593935548 +0000 UTC m=+3.194033988,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.401285 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fcf16299d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.604718493 +0000 UTC m=+3.204816923,LastTimestamp:2026-03-16 00:06:34.604718493 +0000 UTC m=+3.204816923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.406227 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fcf24b915 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.605672725 +0000 UTC m=+3.205771155,LastTimestamp:2026-03-16 00:06:34.605672725 +0000 UTC m=+3.205771155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.411311 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fcf474432 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.607936562 +0000 UTC m=+3.208034992,LastTimestamp:2026-03-16 00:06:34.607936562 +0000 UTC m=+3.208034992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.416593 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fcf58d043 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.609086531 +0000 UTC m=+3.209184961,LastTimestamp:2026-03-16 00:06:34.609086531 +0000 UTC m=+3.209184961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.423050 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fdc11939a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.822521754 +0000 UTC m=+3.422620194,LastTimestamp:2026-03-16 00:06:34.822521754 +0000 UTC m=+3.422620194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.429777 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fdc1210b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.822553776 +0000 UTC m=+3.422652206,LastTimestamp:2026-03-16 00:06:34.822553776 +0000 UTC m=+3.422652206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.435327 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189d299fdcb3a428 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.833142824 +0000 UTC m=+3.433241254,LastTimestamp:2026-03-16 00:06:34.833142824 +0000 UTC m=+3.433241254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 
00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.442079 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fdd0916ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:34.838742714 +0000 UTC m=+3.438841144,LastTimestamp:2026-03-16 00:06:34.838742714 +0000 UTC m=+3.438841144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.447098 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fdd1d19b6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 
00:06:34.840054198 +0000 UTC m=+3.440152628,LastTimestamp:2026-03-16 00:06:34.840054198 +0000 UTC m=+3.440152628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.452147 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe70bdbf6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.006696438 +0000 UTC m=+3.606794878,LastTimestamp:2026-03-16 00:06:35.006696438 +0000 UTC m=+3.606794878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.458476 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe7bb9fb4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.018215348 +0000 UTC m=+3.618313778,LastTimestamp:2026-03-16 00:06:35.018215348 +0000 UTC m=+3.618313778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.464565 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe7ca13ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.01916254 +0000 UTC m=+3.619260980,LastTimestamp:2026-03-16 00:06:35.01916254 +0000 UTC m=+3.619260980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.471187 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299fee514728 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.128686376 +0000 UTC m=+3.728784806,LastTimestamp:2026-03-16 00:06:35.128686376 +0000 UTC m=+3.728784806,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.477555 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff408769c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.224577692 +0000 UTC m=+3.824676122,LastTimestamp:2026-03-16 00:06:35.224577692 +0000 UTC m=+3.824676122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.483691 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff4ee60bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.239645372 +0000 UTC m=+3.839743812,LastTimestamp:2026-03-16 00:06:35.239645372 +0000 UTC m=+3.839743812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.490641 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299ffaa912d9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.335766745 +0000 UTC m=+3.935865175,LastTimestamp:2026-03-16 00:06:35.335766745 +0000 UTC m=+3.935865175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.496013 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d299ffb9eafdc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.35186326 +0000 UTC m=+3.951961690,LastTimestamp:2026-03-16 00:06:35.35186326 +0000 UTC m=+3.951961690,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.502882 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a02b877192 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.155646354 +0000 UTC m=+4.755744814,LastTimestamp:2026-03-16 00:06:36.155646354 +0000 UTC m=+4.755744814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.509531 4983 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a03ad871e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.412613091 +0000 UTC m=+5.012711531,LastTimestamp:2026-03-16 00:06:36.412613091 +0000 UTC m=+5.012711531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.514829 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a03ba5fcb3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.426083507 +0000 UTC m=+5.026181937,LastTimestamp:2026-03-16 00:06:36.426083507 +0000 UTC m=+5.026181937,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.519629 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a03bb8133f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.427268927 +0000 UTC m=+5.027367357,LastTimestamp:2026-03-16 00:06:36.427268927 +0000 UTC m=+5.027367357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.523857 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a04b329398 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.686955416 +0000 UTC m=+5.287053846,LastTimestamp:2026-03-16 00:06:36.686955416 +0000 UTC m=+5.287053846,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.529132 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a04c514f9a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.705746842 +0000 UTC m=+5.305845302,LastTimestamp:2026-03-16 00:06:36.705746842 +0000 UTC m=+5.305845302,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.534209 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a04c6faebf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.707737279 +0000 UTC m=+5.307835739,LastTimestamp:2026-03-16 00:06:36.707737279 +0000 UTC m=+5.307835739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.539120 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a05ccca4d0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.98226504 +0000 UTC m=+5.582363500,LastTimestamp:2026-03-16 00:06:36.98226504 +0000 UTC m=+5.582363500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.543696 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a05d827a02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.994181634 +0000 UTC m=+5.594280094,LastTimestamp:2026-03-16 00:06:36.994181634 +0000 UTC m=+5.594280094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.549003 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a05d997feb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:36.995690475 +0000 UTC m=+5.595788905,LastTimestamp:2026-03-16 00:06:36.995690475 +0000 UTC m=+5.595788905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.554408 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a06e45ff24 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.275430692 +0000 UTC m=+5.875529112,LastTimestamp:2026-03-16 00:06:37.275430692 +0000 UTC m=+5.875529112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.559151 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a06f81c9c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.296126403 +0000 UTC m=+5.896224833,LastTimestamp:2026-03-16 00:06:37.296126403 +0000 UTC m=+5.896224833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.563962 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a06f9894ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.297620154 +0000 UTC m=+5.897718584,LastTimestamp:2026-03-16 00:06:37.297620154 +0000 UTC m=+5.897718584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.569220 4983 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a07d13e65d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.523805789 +0000 UTC m=+6.123904229,LastTimestamp:2026-03-16 00:06:37.523805789 +0000 UTC m=+6.123904229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.573608 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189d29a07dc7e63e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:37.535602238 +0000 UTC m=+6.135700678,LastTimestamp:2026-03-16 00:06:37.535602238 +0000 UTC m=+6.135700678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.583230 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500d0186 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:06:56 crc kubenswrapper[4983]: body: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358313862 +0000 UTC m=+13.958412292,LastTimestamp:2026-03-16 00:06:45.358313862 +0000 UTC m=+13.958412292,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: W0316 00:06:56.588745 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.588856 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.588831 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500dc87a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358364794 +0000 UTC m=+13.958463224,LastTimestamp:2026-03-16 00:06:45.358364794 +0000 UTC m=+13.958463224,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.595844 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a2762f9bde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:06:56 crc kubenswrapper[4983]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:06:56 crc kubenswrapper[4983]: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998115806 +0000 UTC m=+14.598214246,LastTimestamp:2026-03-16 00:06:45.998115806 +0000 UTC m=+14.598214246,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.602686 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a276305972 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998164338 +0000 UTC m=+14.598262768,LastTimestamp:2026-03-16 00:06:45.998164338 +0000 UTC m=+14.598262768,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.610863 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a2762f9bde\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-apiserver-crc.189d29a2762f9bde 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 16 00:06:56 crc kubenswrapper[4983]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 16 00:06:56 crc kubenswrapper[4983]: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998115806 +0000 UTC m=+14.598214246,LastTimestamp:2026-03-16 00:06:46.004165431 +0000 UTC m=+14.604263851,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.618628 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d29a276305972\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d29a276305972 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.998164338 +0000 UTC m=+14.598262768,LastTimestamp:2026-03-16 00:06:46.004201442 +0000 UTC 
m=+14.604299872,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.624308 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d299fe7ca13ac\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299fe7ca13ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.01916254 +0000 UTC m=+3.619260980,LastTimestamp:2026-03-16 00:06:46.206131756 +0000 UTC m=+14.806230196,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.629738 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d299ff408769c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff408769c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.224577692 +0000 UTC m=+3.824676122,LastTimestamp:2026-03-16 00:06:46.405514264 +0000 UTC m=+15.005612704,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.635866 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189d299ff4ee60bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189d299ff4ee60bc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:35.239645372 +0000 UTC m=+3.839743812,LastTimestamp:2026-03-16 00:06:46.413881148 +0000 UTC m=+15.013979598,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.644791 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a2500d0186\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event=< Mar 16 00:06:56 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500d0186 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 16 00:06:56 crc kubenswrapper[4983]: body: Mar 16 00:06:56 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358313862 +0000 UTC m=+13.958412292,LastTimestamp:2026-03-16 00:06:55.359600166 +0000 UTC m=+23.959698636,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 16 00:06:56 crc kubenswrapper[4983]: > Mar 16 00:06:56 crc kubenswrapper[4983]: E0316 00:06:56.649874 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d29a2500dc87a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a2500dc87a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:45.358364794 +0000 UTC m=+13.958463224,LastTimestamp:2026-03-16 00:06:55.359685718 +0000 UTC m=+23.959784198,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:06:56 crc kubenswrapper[4983]: I0316 00:06:56.906853 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.032171 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.245183 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.246937 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.247015 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.247041 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:57 crc kubenswrapper[4983]: I0316 00:06:57.248026 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.033812 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.255928 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.260310 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776"} Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.260605 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.262155 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.262222 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:58 crc kubenswrapper[4983]: I0316 00:06:58.262245 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.034959 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:06:59 crc kubenswrapper[4983]: W0316 00:06:59.142508 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.142606 4983 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.266669 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.268060 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.270902 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" exitCode=255 Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.270973 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776"} Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.271072 4983 scope.go:117] "RemoveContainer" containerID="4a0bf0ba7285d119cb096b609ac2808177133d897cb506ce82de3bfa026809d0" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.271267 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.272687 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.272799 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.272828 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.274221 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.274717 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.416141 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419413 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:06:59 crc kubenswrapper[4983]: I0316 00:06:59.419463 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.422696 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:06:59 crc kubenswrapper[4983]: E0316 00:06:59.423189 4983 controller.go:145] "Failed 
to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:00 crc kubenswrapper[4983]: I0316 00:07:00.032089 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:00 crc kubenswrapper[4983]: I0316 00:07:00.275813 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:01 crc kubenswrapper[4983]: I0316 00:07:01.030124 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:01 crc kubenswrapper[4983]: W0316 00:07:01.116446 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 16 00:07:01 crc kubenswrapper[4983]: E0316 00:07:01.116500 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 16 00:07:02 crc kubenswrapper[4983]: I0316 00:07:02.033179 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:02 crc kubenswrapper[4983]: E0316 00:07:02.168009 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:07:02 crc kubenswrapper[4983]: W0316 00:07:02.545712 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:02 crc kubenswrapper[4983]: E0316 00:07:02.546096 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 16 00:07:03 crc kubenswrapper[4983]: I0316 00:07:03.030562 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.032824 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398237 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398338 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer"
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398434 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.398660 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.400531 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.400595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.400621 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.401485 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 16 00:07:04 crc kubenswrapper[4983]: I0316 00:07:04.401866 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272" gracePeriod=30
Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.407607 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 16 00:07:04 crc kubenswrapper[4983]: &Event{ObjectMeta:{kube-controller-manager-crc.189d29a6beec563f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer
Mar 16 00:07:04 crc kubenswrapper[4983]: body:
Mar 16 00:07:04 crc kubenswrapper[4983]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:04.398313023 +0000 UTC m=+32.998411493,LastTimestamp:2026-03-16 00:07:04.398313023 +0000 UTC m=+32.998411493,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 16 00:07:04 crc kubenswrapper[4983]: >
Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.410032 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a6beed6bb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55000->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:04.398384054 +0000 UTC m=+32.998482524,LastTimestamp:2026-03-16 00:07:04.398384054 +0000 UTC m=+32.998482524,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.413312 4983 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d29a6bf2205a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:07:04.40183133 +0000 UTC m=+33.001929840,LastTimestamp:2026-03-16 00:07:04.40183133 +0000 UTC m=+33.001929840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.421264 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d299f7b92600e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f7b92600e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.20357275 +0000 UTC m=+1.803671180,LastTimestamp:2026-03-16 00:07:04.420139925 +0000 UTC m=+33.020238395,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.605998 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d299f8d2fae19\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8d2fae19 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.499094553 +0000 UTC m=+2.099193013,LastTimestamp:2026-03-16 00:07:04.600028291 +0000 UTC m=+33.200126721,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:04 crc kubenswrapper[4983]: E0316 00:07:04.614142 4983 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189d299f8e097a06\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189d299f8e097a06 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:06:33.51336807 +0000 UTC m=+2.113466530,LastTimestamp:2026-03-16 00:07:04.610078051 +0000 UTC m=+33.210176501,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.032278 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.294359 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.294993 4983 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272" exitCode=255
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.295041 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272"}
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.295086 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907"}
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.295211 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.296442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.296507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:05 crc kubenswrapper[4983]: I0316 00:07:05.296525 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.031656 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.300951 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.302134 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.303554 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.303628 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.303653 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.304645 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776"
Mar 16 00:07:06 crc kubenswrapper[4983]: E0316 00:07:06.305017 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.423209 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425191 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.425222 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:07:06 crc kubenswrapper[4983]: E0316 00:07:06.431941 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 16 00:07:06 crc kubenswrapper[4983]: E0316 00:07:06.431979 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 16 00:07:06 crc kubenswrapper[4983]: I0316 00:07:06.907239 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.034280 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.300203 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301109 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301157 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301194 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.301881 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776"
Mar 16 00:07:07 crc kubenswrapper[4983]: E0316 00:07:07.302103 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.312791 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.312902 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.313687 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.313875 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:07 crc kubenswrapper[4983]: I0316 00:07:07.314006 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:08 crc kubenswrapper[4983]: I0316 00:07:08.031084 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:09 crc kubenswrapper[4983]: I0316 00:07:09.033968 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:10 crc kubenswrapper[4983]: I0316 00:07:10.029967 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:11 crc kubenswrapper[4983]: I0316 00:07:11.031353 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.030444 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:12 crc kubenswrapper[4983]: E0316 00:07:12.168138 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.358235 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.358527 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.360262 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.360334 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:12 crc kubenswrapper[4983]: I0316 00:07:12.360356 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.032382 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.432559 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434039 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434102 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434127 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:13 crc kubenswrapper[4983]: I0316 00:07:13.434169 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:07:13 crc kubenswrapper[4983]: E0316 00:07:13.437285 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 16 00:07:13 crc kubenswrapper[4983]: E0316 00:07:13.438491 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 16 00:07:14 crc kubenswrapper[4983]: I0316 00:07:14.033655 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:14 crc kubenswrapper[4983]: W0316 00:07:14.975957 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 16 00:07:14 crc kubenswrapper[4983]: E0316 00:07:14.976025 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.032136 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.131309 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.131516 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.134228 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.134258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.134268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.136953 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320041 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320934 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320964 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:15 crc kubenswrapper[4983]: I0316 00:07:15.320973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:16 crc kubenswrapper[4983]: I0316 00:07:16.030638 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:17 crc kubenswrapper[4983]: I0316 00:07:17.034545 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:18 crc kubenswrapper[4983]: W0316 00:07:18.003508 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:18 crc kubenswrapper[4983]: E0316 00:07:18.003948 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.031246 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.092125 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.093601 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.093877 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.094089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:18 crc kubenswrapper[4983]: I0316 00:07:18.095286 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776"
Mar 16 00:07:18 crc kubenswrapper[4983]: E0316 00:07:18.095842 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 16 00:07:18 crc kubenswrapper[4983]: W0316 00:07:18.284971 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 16 00:07:18 crc kubenswrapper[4983]: E0316 00:07:18.285024 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 16 00:07:19 crc kubenswrapper[4983]: I0316 00:07:19.032376 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.027707 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.437888 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439603 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439636 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439646 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:20 crc kubenswrapper[4983]: I0316 00:07:20.439670 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:07:20 crc kubenswrapper[4983]: E0316 00:07:20.445608 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 16 00:07:20 crc kubenswrapper[4983]: E0316 00:07:20.445881 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 16 00:07:21 crc kubenswrapper[4983]: I0316 00:07:21.031448 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:22 crc kubenswrapper[4983]: I0316 00:07:22.030781 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:22 crc kubenswrapper[4983]: E0316 00:07:22.168276 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 16 00:07:23 crc kubenswrapper[4983]: I0316 00:07:23.030929 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:24 crc kubenswrapper[4983]: I0316 00:07:24.030506 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:25 crc kubenswrapper[4983]: I0316 00:07:25.031715 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:26 crc kubenswrapper[4983]: I0316 00:07:26.033213 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:26 crc kubenswrapper[4983]: W0316 00:07:26.387338 4983 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 16 00:07:26 crc kubenswrapper[4983]: E0316 00:07:26.387407 4983 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.022845 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.023429 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.025021 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.025058 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.025071 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.033610 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.446282 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448227 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:27 crc kubenswrapper[4983]: I0316 00:07:27.448295 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 16 00:07:27 crc kubenswrapper[4983]: E0316 00:07:27.450609 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 16 00:07:27 crc kubenswrapper[4983]: E0316 00:07:27.450682 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 16 00:07:28 crc kubenswrapper[4983]: I0316 00:07:28.033248 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:29 crc kubenswrapper[4983]: I0316 00:07:29.034975 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:30 crc kubenswrapper[4983]: I0316 00:07:30.034152 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:31 crc kubenswrapper[4983]: I0316 00:07:31.035835 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.037960 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.091681 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.093260 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.093665 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.093687 4983 kubelet_node_status.go:724] "Recording event message for
node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.094687 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:07:32 crc kubenswrapper[4983]: E0316 00:07:32.168379 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.367545 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.369362 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd"} Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.369545 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.370479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.370564 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:32 crc kubenswrapper[4983]: I0316 00:07:32.370622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.030716 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:33 crc 
kubenswrapper[4983]: I0316 00:07:33.374229 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.376134 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.378734 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" exitCode=255 Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.378825 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd"} Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.378877 4983 scope.go:117] "RemoveContainer" containerID="ec1a8491a358ee22066a6a27e1488e9aea78dfa2073d73ef8fa2f5cfe1d71776" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.379090 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.380527 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.380578 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.380595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:33 crc kubenswrapper[4983]: I0316 00:07:33.381469 4983 scope.go:117] "RemoveContainer" 
containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:33 crc kubenswrapper[4983]: E0316 00:07:33.381879 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.037566 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.385178 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.451159 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452174 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:34 crc kubenswrapper[4983]: I0316 00:07:34.452299 4983 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 16 00:07:34 crc kubenswrapper[4983]: E0316 00:07:34.458506 4983 controller.go:145] 
"Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 16 00:07:34 crc kubenswrapper[4983]: E0316 00:07:34.458736 4983 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 16 00:07:35 crc kubenswrapper[4983]: I0316 00:07:35.036157 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.031973 4983 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.300800 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.301123 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.302896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.302948 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.302967 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 
00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.303848 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:36 crc kubenswrapper[4983]: E0316 00:07:36.304153 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.540850 4983 csr.go:261] certificate signing request csr-tjcxq is approved, waiting to be issued Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.551877 4983 csr.go:257] certificate signing request csr-tjcxq is issued Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.652463 4983 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.871143 4983 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.906910 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.907506 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.909865 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.909970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.910250 
4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:36 crc kubenswrapper[4983]: I0316 00:07:36.911113 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:36 crc kubenswrapper[4983]: E0316 00:07:36.911448 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:37 crc kubenswrapper[4983]: I0316 00:07:37.553523 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-18 11:41:22.468164912 +0000 UTC Mar 16 00:07:37 crc kubenswrapper[4983]: I0316 00:07:37.554464 4983 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 5939h33m44.913709387s for next certificate rotation Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.459821 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461114 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461202 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461221 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.461323 4983 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.472553 4983 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.472844 4983 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.472876 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476059 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476133 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476159 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.476177 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.499103 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505664 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505707 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505724 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.505736 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.515200 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.524969 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525004 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525016 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525034 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.525047 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.540073 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547543 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547552 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547568 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:41 crc kubenswrapper[4983]: I0316 00:07:41.547580 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:41Z","lastTransitionTime":"2026-03-16T00:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.560419 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.560544 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.560570 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.661226 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.762123 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.862663 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:41 crc kubenswrapper[4983]: E0316 00:07:41.963735 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.063984 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.164296 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.168479 4983 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.264426 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: 
E0316 00:07:42.365536 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.466697 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.567047 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.668140 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.768524 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.869670 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:42 crc kubenswrapper[4983]: E0316 00:07:42.970154 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.071260 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.172385 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.273574 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.374249 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.474486 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.575546 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.675716 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.776427 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.877485 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:43 crc kubenswrapper[4983]: E0316 00:07:43.978052 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.078685 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.179420 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.280580 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.380689 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.481597 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.582111 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.683194 4983 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.783325 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.883529 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:44 crc kubenswrapper[4983]: E0316 00:07:44.984120 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.084577 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.185635 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.286459 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.387469 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.488624 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.589646 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.689853 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.790274 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.890839 4983 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:45 crc kubenswrapper[4983]: E0316 00:07:45.991881 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.092154 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.192315 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.293448 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.393607 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.494205 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.594567 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.694842 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.795680 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.896696 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:46 crc kubenswrapper[4983]: E0316 00:07:46.997321 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc 
kubenswrapper[4983]: E0316 00:07:47.098362 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.198478 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.299514 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.400614 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.501277 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.601835 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.702961 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.804134 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:47 crc kubenswrapper[4983]: E0316 00:07:47.905016 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.005165 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.106242 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.207344 4983 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.307815 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.408827 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.509887 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.610137 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.710303 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.811372 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:48 crc kubenswrapper[4983]: E0316 00:07:48.911906 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.012072 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.113092 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.213246 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.314349 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.414898 4983 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.515460 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.616530 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.717390 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.818462 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:49 crc kubenswrapper[4983]: E0316 00:07:49.919569 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.020588 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.092237 4983 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.093775 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.093826 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.093836 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:50 crc kubenswrapper[4983]: I0316 00:07:50.094409 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:07:50 crc kubenswrapper[4983]: 
E0316 00:07:50.094602 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.121395 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.221797 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.322452 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.423438 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.524131 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.624456 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.725543 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.825872 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:50 crc kubenswrapper[4983]: E0316 00:07:50.926398 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: 
E0316 00:07:51.026861 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.127852 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.228530 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.329068 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.429794 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.530473 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.630610 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.731182 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.737977 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738054 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738079 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738110 4983 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.738136 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:51Z","lastTransitionTime":"2026-03-16T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.756510 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765489 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765538 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765556 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765582 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.765601 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:51Z","lastTransitionTime":"2026-03-16T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.783585 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794164 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794222 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794241 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794265 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.794283 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:51Z","lastTransitionTime":"2026-03-16T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.810172 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.817651 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.817691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.817702 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.817720 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:51 crc kubenswrapper[4983]: I0316 00:07:51.817731 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:51Z","lastTransitionTime":"2026-03-16T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.827881 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.828015 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.828033 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:51 crc kubenswrapper[4983]: E0316 00:07:51.928805 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.029254 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.130234 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.169624 4983 eviction_manager.go:285] "Eviction manager: failed to 
get summary stats" err="failed to get node info: node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.230941 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.331736 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.432194 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.532469 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.633568 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.733744 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: E0316 00:07:52.834971 4983 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.843746 4983 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938168 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 
00:07:52.938267 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:52 crc kubenswrapper[4983]: I0316 00:07:52.938285 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:52Z","lastTransitionTime":"2026-03-16T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.012453 4983 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041697 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041789 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041805 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041827 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.041874 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.047937 4983 apiserver.go:52] "Watching apiserver" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.056462 4983 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.057908 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf","openshift-ovn-kubernetes/ovnkube-node-wsfb4","openshift-machine-config-operator/machine-config-daemon-7sbnj","openshift-multus/network-metrics-daemon-qvtjp","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-operator/iptables-alerter-4ln5h","openshift-dns/node-resolver-v748m","openshift-multus/multus-tqncp","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-image-registry/node-ca-d2h5k","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-multus/multus-additional-cni-plugins-pp6bs"] Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.059984 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060447 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.060552 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060650 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060782 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.060853 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.060934 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.060993 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.061272 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.062134 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.062474 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.063061 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.063540 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.063583 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.063645 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.064208 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.065705 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.066221 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.066436 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.066739 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.067108 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.068380 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.070945 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.070921 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.073622 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074103 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074209 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 16 00:07:53 crc kubenswrapper[4983]: 
I0316 00:07:53.074641 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.074956 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075127 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075370 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075543 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075408 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.075923 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.076522 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.076741 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.076984 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 
00:07:53.077177 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077380 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077492 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077662 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.077915 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078072 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078225 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078332 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078437 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078342 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.078641 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 16 00:07:53 
crc kubenswrapper[4983]: I0316 00:07:53.078377 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.079685 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.079748 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.080067 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.080114 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.093448 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.109830 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.122000 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.135549 4983 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.137574 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144362 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144440 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.144455 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.152559 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161379 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161417 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161439 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161459 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161479 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161500 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161543 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161563 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161584 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161688 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161712 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161732 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161865 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161890 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161909 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161931 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161950 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161976 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.161997 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162017 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162038 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162058 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162077 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162097 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.162152 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:07:53.662102549 +0000 UTC m=+82.262201019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162226 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162297 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162332 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162519 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162620 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162690 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162738 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162833 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.162884 4983 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163012 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163015 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163084 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163188 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163246 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163272 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163302 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163352 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163379 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163405 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163432 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163463 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163484 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163516 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163568 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163617 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 
00:07:53.163670 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163742 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163862 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163920 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.163974 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164022 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164065 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164069 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164144 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164167 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164189 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164212 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164235 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164231 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164256 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164279 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164299 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164253 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164358 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164408 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164451 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164473 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164496 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164519 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164696 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164741 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164789 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164825 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164847 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164837 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164868 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164888 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164908 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164929 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164952 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164972 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.164992 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165011 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165033 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165038 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165052 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165054 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165074 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165167 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165225 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165432 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165497 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165589 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165636 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165694 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165723 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165751 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165816 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165841 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165894 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165942 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.165996 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166045 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166099 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166109 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166148 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166215 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166278 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166330 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166383 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166435 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166492 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166542 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166592 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166643 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166694 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:07:53 crc 
kubenswrapper[4983]: I0316 00:07:53.166750 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166853 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166908 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166959 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167010 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167064 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168926 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168987 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169029 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169267 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169334 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 
00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169387 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169434 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169488 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169588 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169647 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169701 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169797 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169869 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169922 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169973 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170031 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170088 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170143 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170197 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170252 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170294 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170332 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170429 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170465 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170502 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170549 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170602 4983 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171439 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171521 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171560 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171627 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171660 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171731 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172166 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172188 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172205 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172535 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172551 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172587 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172605 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172624 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172642 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172682 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc 
kubenswrapper[4983]: I0316 00:07:53.172699 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172716 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172760 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172778 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172795 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172811 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172850 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172867 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172883 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172901 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172933 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 
00:07:53.172951 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172967 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173006 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173023 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173041 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173058 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173148 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173172 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173261 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173283 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173301 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 16 00:07:53 crc kubenswrapper[4983]: 
I0316 00:07:53.173316 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173351 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173369 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173387 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173419 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173436 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173454 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173469 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173550 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173597 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173705 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173721 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174421 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174468 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174509 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174613 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174651 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174688 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174724 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174858 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174953 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbxh\" (UniqueName: \"kubernetes.io/projected/43da17ff-aed1-44a2-a154-6800c3dd6ca9-kube-api-access-trbxh\") pod 
\"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174993 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175032 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-system-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175086 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-cni-binary-copy\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175133 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-conf-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175187 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/48a48757-a3b8-4d4d-92ba-6a2459a26a86-mcd-auth-proxy-config\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175236 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175284 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-os-release\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175330 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/48a48757-a3b8-4d4d-92ba-6a2459a26a86-rootfs\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175377 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6740e33-489f-4f45-b3e5-fdceaebf4301-hosts-file\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175424 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175475 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48a48757-a3b8-4d4d-92ba-6a2459a26a86-proxy-tls\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175520 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175562 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175604 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-netns\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175650 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175700 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92wtj\" (UniqueName: \"kubernetes.io/projected/9138d88d-b777-4cab-b3d2-2099f01b205b-kube-api-access-92wtj\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175799 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175851 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-kubelet\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175909 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175975 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176025 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176073 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176116 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176148 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176194 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-system-cni-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176241 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-binary-copy\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176291 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176339 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176473 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176528 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-multus-certs\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176578 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l24hn\" (UniqueName: \"kubernetes.io/projected/6993dda4-ac10-47af-b406-d49d7781fbe5-kube-api-access-l24hn\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176626 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-multus\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176673 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w6zq\" (UniqueName: \"kubernetes.io/projected/48a48757-a3b8-4d4d-92ba-6a2459a26a86-kube-api-access-5w6zq\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176720 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176803 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176863 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176917 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177034 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-socket-dir-parent\") pod \"multus-tqncp\" (UID: 
\"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177090 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-daemon-config\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177136 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjz9m\" (UniqueName: \"kubernetes.io/projected/f81ec143-6c51-4f96-ae71-a4759bac7c70-kube-api-access-gjz9m\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177184 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177226 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166437 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod 
"c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166474 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166470 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166838 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166842 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166822 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166865 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167315 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167373 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167576 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167637 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167810 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.167980 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168202 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168235 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168480 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.168504 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169121 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169521 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169659 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169729 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169722 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169894 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.169994 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170012 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170042 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170638 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.170793 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171079 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171123 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171348 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.171499 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172380 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.172604 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173406 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.173487 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174035 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174278 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174340 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174348 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174355 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174806 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.174972 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175451 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175458 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175715 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175797 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175481 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.175915 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176070 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176287 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176372 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.176560 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177116 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.178091 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.178220 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.178851 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179019 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179035 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179082 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179432 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.179574 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180037 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180289 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180293 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180442 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180617 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180829 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.180879 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181132 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181231 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181245 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181529 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181621 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.181934 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182390 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182529 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182833 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.182867 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.183284 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.183436 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.183963 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184007 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184145 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184424 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184871 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.184914 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185038 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185060 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185156 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185265 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185353 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185075 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185540 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185695 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.185924 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.166704 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186092 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186554 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186572 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186646 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.186923 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187074 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187498 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187585 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.187617 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.189304 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.189395 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.689370335 +0000 UTC m=+82.289468805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.190233 4983 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.202716 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203034 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203178 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203520 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203580 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203604 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203601 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203637 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203691 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.177276 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-cnibin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203904 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.203985 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-bin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204043 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204052 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204116 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-etc-kubernetes\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204322 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/607f8329-b349-45da-bb9b-785740b4ad4f-serviceca\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.204371 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.205795 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.205862 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.205921 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206059 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206210 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206387 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206433 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-cnibin\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206479 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-hostroot\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206523 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206571 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206621 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206860 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206911 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206955 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207001 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207031 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-os-release\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207065 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-k8s-cni-cncf-io\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207101 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8sn\" (UniqueName: \"kubernetes.io/projected/b6740e33-489f-4f45-b3e5-fdceaebf4301-kube-api-access-6g8sn\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207137 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207172 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207214 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207253 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207296 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207332 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/607f8329-b349-45da-bb9b-785740b4ad4f-host\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207355 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207387 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207422 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncxpw\" (UniqueName: \"kubernetes.io/projected/607f8329-b349-45da-bb9b-785740b4ad4f-kube-api-access-ncxpw\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207563 4983 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207581 4983 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207596 4983 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207610 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207629 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207643 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207659 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207672 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207690 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207704 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207718 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207735 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207749 4983 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207791 4983 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207811 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207837 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207856 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207875 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207892 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207918 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207937 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207958 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207977 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208000 4983 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208017 4983 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208035 4983 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208157 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208179 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208197 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208214 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208237 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208328 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208351 4983 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208376 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208402 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208424 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208527 4983 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208925 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208950 4983 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208969 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208987 4983 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209011 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209032 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209050 4983 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209068 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209094 4983 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209113 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209130 4983 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209147 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209171 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209192 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209210 4983 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209234 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209251 4983 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209269 4983 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209288 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209311 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209330 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209347 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209366 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209388 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209406 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209424 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209447 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209465 4983 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209519 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209537 4983 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209560 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209578 4983 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209597 4983 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209616 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209638 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209656 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209671 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209687 4983 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209709 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209725 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209741 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209790 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209809 4983 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209824 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209843 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209865 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209880 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209895 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209920 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 16 00:07:53 crc
kubenswrapper[4983]: I0316 00:07:53.209946 4983 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209962 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209979 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209995 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210015 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210031 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210048 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 
00:07:53.210071 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210089 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210106 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210121 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210144 4983 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210161 4983 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210178 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210193 4983 reconciler_common.go:293] "Volume detached for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210213 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210230 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210247 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210269 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210287 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210304 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210322 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 
00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210344 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210361 4983 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210377 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210394 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210415 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210432 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210448 4983 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210465 4983 
reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210487 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210506 4983 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210594 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210624 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210647 4983 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210667 4983 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210687 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210712 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210732 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210751 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210793 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210817 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210836 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210859 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210881 4983 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210906 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210925 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210941 4983 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210969 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210992 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214162 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206409 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206426 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206451 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.202561 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206907 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206928 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.206963 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207262 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207292 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.207333 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.207917 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.217426 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.217449 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.217585 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.717548592 +0000 UTC m=+82.317647032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208354 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208408 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208425 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). 
InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.208908 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.209251 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210047 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210080 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210561 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.210734 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211553 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211573 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211802 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.211969 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.212081 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.212400 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.212716 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.213058 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.213141 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.213360 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214009 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214046 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.214234 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.215231 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.215327 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.216289 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.218172 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.218663 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.220345 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-16 00:07:53.720317762 +0000 UTC m=+82.320416202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.222291 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.226309 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.226355 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.227311 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.228783 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.229054 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.230009 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.230069 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.230142 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.231033 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.233020 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.235520 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240355 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240378 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240392 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.240446 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.740424836 +0000 UTC m=+82.340523276 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.244495 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.245532 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.245836 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.247991 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.248566 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250163 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250202 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 
00:07:53.250218 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250229 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.250876 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.252026 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.252513 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.254102 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.256801 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.256979 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.257624 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.257984 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.258735 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.259613 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.259905 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.260308 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.260446 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.265786 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.266466 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.270599 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.275494 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.285662 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.293667 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.300295 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncxpw\" (UniqueName: \"kubernetes.io/projected/607f8329-b349-45da-bb9b-785740b4ad4f-kube-api-access-ncxpw\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311687 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311704 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311721 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-system-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311736 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trbxh\" (UniqueName: \"kubernetes.io/projected/43da17ff-aed1-44a2-a154-6800c3dd6ca9-kube-api-access-trbxh\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311775 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311806 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-os-release\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311827 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-cni-binary-copy\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311847 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-conf-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311854 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-system-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311872 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48a48757-a3b8-4d4d-92ba-6a2459a26a86-mcd-auth-proxy-config\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311875 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod 
\"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311934 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6740e33-489f-4f45-b3e5-fdceaebf4301-hosts-file\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311905 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b6740e33-489f-4f45-b3e5-fdceaebf4301-hosts-file\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311982 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311990 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312008 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" 
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312034 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/48a48757-a3b8-4d4d-92ba-6a2459a26a86-rootfs\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312043 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.311988 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-os-release\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312065 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312148 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312102 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/48a48757-a3b8-4d4d-92ba-6a2459a26a86-rootfs\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312164 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312202 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312206 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-netns\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312228 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-netns\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312084 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-conf-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312254 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48a48757-a3b8-4d4d-92ba-6a2459a26a86-proxy-tls\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312286 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-kubelet\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312321 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312371 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312400 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312427 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312457 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92wtj\" (UniqueName: \"kubernetes.io/projected/9138d88d-b777-4cab-b3d2-2099f01b205b-kube-api-access-92wtj\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312486 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-system-cni-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312512 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-binary-copy\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312543 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312569 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312596 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312625 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-multus-certs\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312641 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/48a48757-a3b8-4d4d-92ba-6a2459a26a86-mcd-auth-proxy-config\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312656 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312779 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312830 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312871 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l24hn\" (UniqueName: \"kubernetes.io/projected/6993dda4-ac10-47af-b406-d49d7781fbe5-kube-api-access-l24hn\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312913 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-multus\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc 
kubenswrapper[4983]: I0316 00:07:53.312929 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w6zq\" (UniqueName: \"kubernetes.io/projected/48a48757-a3b8-4d4d-92ba-6a2459a26a86-kube-api-access-5w6zq\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312971 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjz9m\" (UniqueName: \"kubernetes.io/projected/f81ec143-6c51-4f96-ae71-a4759bac7c70-kube-api-access-gjz9m\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312992 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312883 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-system-cni-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313029 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 
crc kubenswrapper[4983]: I0316 00:07:53.313050 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-kubelet\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.312983 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313375 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313484 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-socket-dir-parent\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313522 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-daemon-config\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313612 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-binary-copy\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313622 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/607f8329-b349-45da-bb9b-785740b4ad4f-serviceca\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313679 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313688 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-multus-certs\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313705 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-cnibin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313737 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-cnibin\") pod \"multus-tqncp\" (UID: 
\"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313749 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-bin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313783 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313814 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-cni-binary-copy\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.313997 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314033 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314058 
4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-etc-kubernetes\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314469 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314504 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314522 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314538 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-cnibin\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314611 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-hostroot\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314628 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314662 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-os-release\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314678 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-k8s-cni-cncf-io\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314698 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8sn\" (UniqueName: \"kubernetes.io/projected/b6740e33-489f-4f45-b3e5-fdceaebf4301-kube-api-access-6g8sn\") pod 
\"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314715 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314937 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314969 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.314996 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-bin\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315024 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-cnibin\") pod 
\"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315068 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-socket-dir-parent\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315048 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315316 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-etc-kubernetes\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315338 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315361 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"ovnkube-node-wsfb4\" (UID: 
\"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-hostroot\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315423 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315469 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-os-release\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315498 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-run-k8s-cni-cncf-io\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315581 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315668 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.315693 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-host-var-lib-cni-multus\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316066 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9138d88d-b777-4cab-b3d2-2099f01b205b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316178 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/607f8329-b349-45da-bb9b-785740b4ad4f-host\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316242 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316245 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-daemon-config\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316267 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/607f8329-b349-45da-bb9b-785740b4ad4f-host\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316313 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316320 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316350 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316628 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-cni-dir\") pod \"multus-tqncp\" (UID: 
\"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.316824 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.316896 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:53.816873802 +0000 UTC m=+82.416972282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.316980 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9138d88d-b777-4cab-b3d2-2099f01b205b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317143 4983 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317149 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81ec143-6c51-4f96-ae71-a4759bac7c70-multus-cni-dir\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 
00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317161 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317175 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317185 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317194 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317203 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317212 4983 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317221 4983 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317230 4983 
reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317239 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317246 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317255 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317266 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317274 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317285 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317294 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317302 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317312 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317321 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317329 4983 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317337 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317346 4983 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317354 4983 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on 
node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317362 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317370 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317395 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317406 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317417 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317427 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317439 4983 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317453 4983 reconciler_common.go:293] "Volume 
detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317463 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317472 4983 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317480 4983 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317490 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317839 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317851 4983 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.317861 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326396 4983 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326423 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326433 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326442 4983 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326454 4983 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326463 4983 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326471 4983 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on 
node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326480 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326488 4983 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326499 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326508 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326517 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326526 4983 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326535 4983 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 
00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326545 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326555 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326563 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326574 4983 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326582 4983 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326591 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326800 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc 
kubenswrapper[4983]: I0316 00:07:53.326966 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/607f8329-b349-45da-bb9b-785740b4ad4f-serviceca\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.326921 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.330124 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.330605 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjz9m\" (UniqueName: \"kubernetes.io/projected/f81ec143-6c51-4f96-ae71-a4759bac7c70-kube-api-access-gjz9m\") pod \"multus-tqncp\" (UID: \"f81ec143-6c51-4f96-ae71-a4759bac7c70\") " pod="openshift-multus/multus-tqncp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.331146 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8sn\" (UniqueName: \"kubernetes.io/projected/b6740e33-489f-4f45-b3e5-fdceaebf4301-kube-api-access-6g8sn\") pod \"node-resolver-v748m\" (UID: \"b6740e33-489f-4f45-b3e5-fdceaebf4301\") " pod="openshift-dns/node-resolver-v748m" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.331385 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/43da17ff-aed1-44a2-a154-6800c3dd6ca9-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.333504 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92wtj\" (UniqueName: 
\"kubernetes.io/projected/9138d88d-b777-4cab-b3d2-2099f01b205b-kube-api-access-92wtj\") pod \"multus-additional-cni-plugins-pp6bs\" (UID: \"9138d88d-b777-4cab-b3d2-2099f01b205b\") " pod="openshift-multus/multus-additional-cni-plugins-pp6bs" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.334364 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l24hn\" (UniqueName: \"kubernetes.io/projected/6993dda4-ac10-47af-b406-d49d7781fbe5-kube-api-access-l24hn\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.335913 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.336911 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.338112 4983 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/48a48757-a3b8-4d4d-92ba-6a2459a26a86-proxy-tls\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.339706 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod \"ovnkube-node-wsfb4\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.340120 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trbxh\" (UniqueName: \"kubernetes.io/projected/43da17ff-aed1-44a2-a154-6800c3dd6ca9-kube-api-access-trbxh\") pod \"ovnkube-control-plane-749d76644c-hjpzf\" (UID: \"43da17ff-aed1-44a2-a154-6800c3dd6ca9\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.340159 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w6zq\" (UniqueName: \"kubernetes.io/projected/48a48757-a3b8-4d4d-92ba-6a2459a26a86-kube-api-access-5w6zq\") pod \"machine-config-daemon-7sbnj\" (UID: \"48a48757-a3b8-4d4d-92ba-6a2459a26a86\") " pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.341134 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncxpw\" (UniqueName: \"kubernetes.io/projected/607f8329-b349-45da-bb9b-785740b4ad4f-kube-api-access-ncxpw\") pod \"node-ca-d2h5k\" (UID: \"607f8329-b349-45da-bb9b-785740b4ad4f\") " pod="openshift-image-registry/node-ca-d2h5k" Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.347055 4983 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353708 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353727 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.353742 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.385716 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.400171 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d2h5k"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.416913 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.420071 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607f8329_b349_45da_bb9b_785740b4ad4f.slice/crio-588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d WatchSource:0}: Error finding container 588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d: Status 404 returned error can't find the container with id 588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.430939 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.440834 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tqncp"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.443662 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7507711e48881fcf269110901d0d8312cd087b99aada9b1d0b2b78795cb41a45"}
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.446083 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2h5k" event={"ID":"607f8329-b349-45da-bb9b-785740b4ad4f","Type":"ContainerStarted","Data":"588bdc17d12b0aa1813488ca2356b44515247409ae93e91e541d5edf68b51c3d"}
Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.446284 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12 WatchSource:0}: Error finding container 9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12: Status 404 returned error can't find the container with id 9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.447065 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.449562 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"28f7cda51e847143b66e94c1076dac6245ded9a97d4156b4bc50bbecefb4643c"}
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.456027 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v748m"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457227 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.457241 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.465745 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81ec143_6c51_4f96_ae71_a4759bac7c70.slice/crio-6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8 WatchSource:0}: Error finding container 6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8: Status 404 returned error can't find the container with id 6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.471138 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.497134 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6740e33_489f_4f45_b3e5_fdceaebf4301.slice/crio-0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da WatchSource:0}: Error finding container 0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da: Status 404 returned error can't find the container with id 0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.504513 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf"
Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.519785 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf055dad5_7c9b_46a1_a715_34847c30d0cf.slice/crio-a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6 WatchSource:0}: Error finding container a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6: Status 404 returned error can't find the container with id a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6
Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.529026 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43da17ff_aed1_44a2_a154_6800c3dd6ca9.slice/crio-401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730 WatchSource:0}: Error finding container 401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730: Status 404 returned error can't find the container with id 401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.553882 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-pp6bs"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559169 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559196 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559357 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.559373 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:53 crc kubenswrapper[4983]: W0316 00:07:53.580202 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9138d88d_b777_4cab_b3d2_2099f01b205b.slice/crio-9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d WatchSource:0}: Error finding container 9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d: Status 404 returned error can't find the container with id 9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663344 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663384 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663398 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.663407 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729461 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729564 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.729624 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.729705 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.729743 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.729729598 +0000 UTC m=+83.329828038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730064 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730085 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730116 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730142 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.73010726 +0000 UTC m=+83.330205710 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730178 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.730167742 +0000 UTC m=+83.330266302 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730201 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.730362 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.730340408 +0000 UTC m=+83.330438828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766701 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766734 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766744 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766776 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.766787 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.830638 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.830676 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830784 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830828 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.830814095 +0000 UTC m=+83.430912525 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830886 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830948 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.830966 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:07:53 crc kubenswrapper[4983]: E0316 00:07:53.831047 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:54.831025372 +0000 UTC m=+83.431123802 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869254 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.869278 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971899 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971987 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:53 crc kubenswrapper[4983]: I0316 00:07:53.971995 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:53Z","lastTransitionTime":"2026-03-16T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075073 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075134 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075151 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.075163 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.098249 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.099248 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.100681 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.101588 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.102955 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.103713 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.104566 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.106044 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.106935 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.108199 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.109071 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.110527 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.111185 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.111918 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.113113 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.113966 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.115269 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.115944 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.116691 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.118106 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.118727 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.120359 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.120964 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.121634 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.122286 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.123165 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.123981 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.124924 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.125484 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.125966 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.126823 4983 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.126929 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.128560 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.129471 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.129918 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.131354 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.132389 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.132928 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.133966 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.134620 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.135467 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.136173 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.137196 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.137932 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.138838 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.140631 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.141557 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes"
Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.142298 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.143203 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.143679 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.144139 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.145137 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.145976 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.147038 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.176974 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.177017 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc 
kubenswrapper[4983]: I0316 00:07:54.177028 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.177045 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.177060 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279466 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279523 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279538 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279561 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.279578 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382275 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382288 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382304 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.382316 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.455123 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" exitCode=0 Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.455188 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.455264 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.457328 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.459727 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.459772 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 
00:07:54.459787 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"336a826201538546d8dcaa8c95a1b3d634f848ac4808afdaa7ffd62c732280c9"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.473229 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2h5k" event={"ID":"607f8329-b349-45da-bb9b-785740b4ad4f","Type":"ContainerStarted","Data":"915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.475574 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.475668 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"6df1ad7a190117d0e59c8a25e7087c31c58bbe58f6ebf81b626cb51fe3232bc8"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.476927 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.478207 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.478267 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.478278 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9d72640b89205babd66f836657a980095e4e77693e562bdc42d6ebc494b1bc12"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.479801 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495" exitCode=0 Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.479890 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.479949 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"9146b0a445a61236b7da18db6ee6e186c1e1013557072d9a5c5a27e2a534fa5d"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.482354 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
event={"ID":"43da17ff-aed1-44a2-a154-6800c3dd6ca9","Type":"ContainerStarted","Data":"78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.482386 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" event={"ID":"43da17ff-aed1-44a2-a154-6800c3dd6ca9","Type":"ContainerStarted","Data":"e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.482396 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" event={"ID":"43da17ff-aed1-44a2-a154-6800c3dd6ca9","Type":"ContainerStarted","Data":"401ef267c7d1bfbcdcf811a82c62c6947a23b08ade4be7715748e3305b896730"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484692 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v748m" event={"ID":"b6740e33-489f-4f45-b3e5-fdceaebf4301","Type":"ContainerStarted","Data":"23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484742 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v748m" event={"ID":"b6740e33-489f-4f45-b3e5-fdceaebf4301","Type":"ContainerStarted","Data":"0344d291246395ab2150363ce8c1b5a06aac354dbc364eac8e634905ec46c3da"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484830 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484878 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484889 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc 
kubenswrapper[4983]: I0316 00:07:54.484905 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.484917 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.494765 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc 
kubenswrapper[4983]: I0316 00:07:54.512268 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.526219 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.538031 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.553247 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.563644 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.584062 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591101 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591155 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.591164 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.604826 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.616058 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.629460 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.646356 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.657132 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.668041 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.679948 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693875 4983 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693891 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.693901 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.694784 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.707012 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.724862 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.736001 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740226 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740600 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740635 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.740675 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.740798 4983 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.740845 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.740832129 +0000 UTC m=+85.340930559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741176 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741189 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741199 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741227 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.741219892 +0000 UTC m=+85.341318322 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741266 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741294 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.741280964 +0000 UTC m=+85.341379394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.741338 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.741331865 +0000 UTC m=+85.341430295 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.749690 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.761077 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.770459 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.781254 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.791104 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc 
kubenswrapper[4983]: I0316 00:07:54.795888 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795945 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795962 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.795972 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.802567 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.821012 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.832067 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.841892 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.841960 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842125 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842145 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842191 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842261 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.842223636 +0000 UTC m=+85.442322066 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842255 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: E0316 00:07:54.842350 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:07:56.84232693 +0000 UTC m=+85.442425410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.847939 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:54Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.897943 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898266 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898283 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:54 crc kubenswrapper[4983]: I0316 00:07:54.898323 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:54Z","lastTransitionTime":"2026-03-16T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000898 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000941 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000966 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.000977 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.091634 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.091741 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.092079 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.092152 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.092206 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.092259 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.092309 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:55 crc kubenswrapper[4983]: E0316 00:07:55.092366 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.103559 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.103612 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.103621 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.104167 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.104191 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208558 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208591 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.208602 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.311525 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.311739 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.311873 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.312017 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.312126 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415503 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415512 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415527 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.415538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.490187 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492845 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492906 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492927 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492946 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.492963 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:07:55 crc 
kubenswrapper[4983]: I0316 00:07:55.501662 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.512393 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18
e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519087 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519165 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519185 4983 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.519201 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.526118 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc 
kubenswrapper[4983]: I0316 00:07:55.537948 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.553136 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.568377 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.587309 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.597215 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.608672 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.619184 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620869 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 
00:07:55.620906 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620917 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.620940 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.632730 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.643106 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.657375 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.671454 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:55Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723031 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723066 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723074 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.723099 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825331 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825388 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.825444 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.927951 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928015 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928054 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:55 crc kubenswrapper[4983]: I0316 00:07:55.928069 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:55Z","lastTransitionTime":"2026-03-16T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031607 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031678 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031697 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.031711 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135148 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135201 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135216 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.135226 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239342 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239365 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239406 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.239428 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342619 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342690 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342729 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342810 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.342835 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446495 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446537 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.446554 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.507918 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.510078 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.512407 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c" exitCode=0 Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.512442 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.529227 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.542068 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550013 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550076 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550104 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.550124 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.559117 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc 
kubenswrapper[4983]: I0316 00:07:56.573623 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.588157 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.608121 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.632443 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.649456 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652617 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652660 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652679 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652702 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.652718 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.665845 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.680354 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.696294 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.710258 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.724797 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.741535 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.753723 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755850 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755877 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755885 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.755911 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763472 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763631 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763704 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.763673501 +0000 UTC m=+89.363771971 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763717 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763836 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.763800185 +0000 UTC m=+89.363898615 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.763862 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763887 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763924 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763931 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763944 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.763956 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.76394907 +0000 UTC m=+89.364047500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.764005 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.763988051 +0000 UTC m=+89.364086511 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.770448 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc 
kubenswrapper[4983]: I0316 00:07:56.784202 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.798496 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.815055 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.826498 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.842378 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.852980 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858432 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858482 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.858511 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.864854 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.864983 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865103 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865127 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865142 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865194 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.865172651 +0000 UTC m=+89.465271101 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865470 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: E0316 00:07:56.865503 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:00.865492972 +0000 UTC m=+89.465591412 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.865689 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.876334 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.885914 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.895366 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.908620 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.929686 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:56Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961260 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961300 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961311 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961326 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:56 crc kubenswrapper[4983]: I0316 00:07:56.961334 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:56Z","lastTransitionTime":"2026-03-16T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064282 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064325 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064337 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064354 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.064367 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.091925 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.091976 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.091985 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.092051 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092090 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092188 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092285 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:57 crc kubenswrapper[4983]: E0316 00:07:57.092370 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167574 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167614 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167628 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.167690 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.273965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274035 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274060 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274082 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.274095 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377571 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377589 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377610 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.377624 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479393 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479448 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479465 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479490 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.479507 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.517402 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87" exitCode=0 Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.517446 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.530172 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.549876 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.559972 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.569216 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.580027 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc 
kubenswrapper[4983]: I0316 00:07:57.582438 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582462 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582483 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.582492 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.591636 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.615657 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.627098 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.638782 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.650494 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.664611 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 
00:07:57.677948 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.690956 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691038 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691063 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.691082 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.695098 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:
07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.706281 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:57Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.793733 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794259 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.794271 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897509 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897545 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897553 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:57 crc kubenswrapper[4983]: I0316 00:07:57.897579 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:57Z","lastTransitionTime":"2026-03-16T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000100 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000142 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000164 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.000173 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.102990 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103032 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103044 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103059 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.103071 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206533 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206643 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206666 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.206683 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.309914 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.309974 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.309991 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.310016 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.310033 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413064 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413163 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413266 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.413290 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516815 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516890 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516914 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.516932 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.527748 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.531990 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662" exitCode=0 Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.532048 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.553238 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.575026 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 
00:07:58.592091 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.606020 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619136 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619217 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619165 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619229 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619473 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.619500 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.629781 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.657449 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.673363 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.691490 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.705109 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.720601 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721572 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc 
kubenswrapper[4983]: I0316 00:07:58.721617 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721629 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721647 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.721659 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.732291 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.750666 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.767384 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:58Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824087 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824125 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824134 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.824159 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927149 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:58 crc kubenswrapper[4983]: I0316 00:07:58.927159 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:58Z","lastTransitionTime":"2026-03-16T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029477 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029531 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029548 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.029561 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091856 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091878 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091971 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.091982 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092132 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092711 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092849 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:07:59 crc kubenswrapper[4983]: E0316 00:07:59.092981 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132346 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132410 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132429 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132455 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.132472 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.235984 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338669 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338687 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.338734 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.441935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442034 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442052 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442079 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.442096 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.542020 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.551713 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.552387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.552730 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.553551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.553946 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.562559 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc 
kubenswrapper[4983]: I0316 00:07:59.579355 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.599446 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.616231 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.627720 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.639119 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.658970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659055 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659071 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659099 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.659115 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.660174 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.673841 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.686373 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.698488 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.711557 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.723809 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.736930 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92w
tj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.752408 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:07:59Z is after 2025-08-24T17:21:41Z" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761673 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761727 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761738 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761772 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.761794 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864619 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864664 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864677 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864695 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.864709 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966947 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966977 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966985 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.966999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:07:59 crc kubenswrapper[4983]: I0316 00:07:59.967008 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:07:59Z","lastTransitionTime":"2026-03-16T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069021 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069053 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069061 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069075 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.069084 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172072 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172152 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172181 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.172203 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273964 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273973 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.273985 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.274003 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377186 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377288 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377836 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.377918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.378199 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481300 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481320 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481343 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.481360 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.554237 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb" exitCode=0 Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.554338 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.561954 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.562918 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.562957 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.563081 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.572237 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588041 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588085 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588097 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588115 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.588127 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.589946 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.596548 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.598349 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.602201 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.614280 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc 
kubenswrapper[4983]: I0316 00:08:00.626653 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.639086 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.657187 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.669151 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.687121 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695631 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695668 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695682 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695701 4983 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.695717 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.700294 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.712972 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.727252 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.736251 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.748043 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.758375 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.769414 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.781115 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.797234 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802602 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc 
kubenswrapper[4983]: I0316 00:08:00.802635 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802660 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.802671 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808014 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808172 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808186 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808168087 +0000 UTC m=+97.408266517 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808211 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.808251 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808378 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808419 4983 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808434 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808451 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808380 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808487 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808467527 +0000 UTC m=+97.408566037 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808533 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808500448 +0000 UTC m=+97.408599018 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.808551 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.808545989 +0000 UTC m=+97.408644419 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.813015 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.826579 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.837145 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.845635 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc 
kubenswrapper[4983]: I0316 00:08:00.854825 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.865653 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.876291 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.884804 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.892084 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.904969 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.904999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.905008 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.905022 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.905031 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:00Z","lastTransitionTime":"2026-03-16T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.907268 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:00Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.909625 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:00 crc kubenswrapper[4983]: I0316 00:08:00.909653 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909743 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909790 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909821 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909835 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909801 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.909788492 +0000 UTC m=+97.509886922 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:00 crc kubenswrapper[4983]: E0316 00:08:00.909909 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:08.909889195 +0000 UTC m=+97.509987675 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007911 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007947 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007956 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007971 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.007980 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092337 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092427 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092485 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092564 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092723 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.092789 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092854 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:01 crc kubenswrapper[4983]: E0316 00:08:01.092911 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111716 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111810 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111838 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111866 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.111889 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213696 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213741 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213768 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213784 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.213797 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316017 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316083 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316113 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.316129 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418851 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418903 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418915 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.418946 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522135 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522182 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522208 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.522224 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.568681 4983 generic.go:334] "Generic (PLEG): container finished" podID="9138d88d-b777-4cab-b3d2-2099f01b205b" containerID="f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537" exitCode=0 Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.568718 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerDied","Data":"f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.588748 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.606992 4983 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624038 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624051 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624066 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.624079 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.628556 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z 
is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.646623 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.667220 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.680108 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.693417 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.705636 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.722237 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727454 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 
00:08:01.727470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.727482 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.731785 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.743018 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.756363 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc 
kubenswrapper[4983]: I0316 00:08:01.766896 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.786028 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:01Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829174 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829207 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.829220 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931785 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931843 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931854 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.931871 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:01 crc kubenswrapper[4983]: I0316 00:08:01.932184 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:01Z","lastTransitionTime":"2026-03-16T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033803 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033871 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033885 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.033893 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078583 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078626 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078653 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.078665 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.091951 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095161 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095203 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095219 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.095230 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.102069 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.102618 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.107008 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109844 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109881 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109893 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109909 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.109920 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.112994 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.120139 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122905 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122946 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122961 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.122975 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.123226 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.135995 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.140085 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143463 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143495 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143523 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.143538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.153874 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.155335 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: E0316 00:08:02.155480 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156711 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156722 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.156746 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.167968 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.192654 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.201199 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.211171 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.224803 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 
00:08:02.236246 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.247225 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.256176 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.258935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.258961 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.258987 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.259004 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.259015 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.272092 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361530 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361564 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361574 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361586 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.361594 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464404 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464449 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464461 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.464485 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567800 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567858 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.567867 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.577488 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" event={"ID":"9138d88d-b777-4cab-b3d2-2099f01b205b","Type":"ContainerStarted","Data":"71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.591257 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name
\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.602079 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: 
I0316 00:08:02.612059 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.628771 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.640044 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.651700 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.662031 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671487 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib
-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671811 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671823 4983 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671840 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.671851 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.682029 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.692727 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.700660 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.709258 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.721115 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc 
kubenswrapper[4983]: I0316 00:08:02.734425 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.754022 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.773965 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.774027 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.774049 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.774078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.774102 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876794 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876843 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876855 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876873 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.876887 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.979404 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.979705 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.979897 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.980080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:02 crc kubenswrapper[4983]: I0316 00:08:02.980248 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:02Z","lastTransitionTime":"2026-03-16T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.082644 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.082908 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.082997 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.083120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.083199 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092077 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092145 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092295 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092355 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.092330 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092477 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092536 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:03 crc kubenswrapper[4983]: E0316 00:08:03.092582 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186461 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186497 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186505 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186522 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.186533 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288434 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288492 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288510 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288535 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.288552 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391091 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391577 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391719 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.391944 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495252 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495284 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495294 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495311 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.495322 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.581912 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/0.log" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.584459 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4" exitCode=1 Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.584491 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.585090 4983 scope.go:117] "RemoveContainer" containerID="ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599221 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.599244 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.601545 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.614977 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.629223 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.647437 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.661133 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.682219 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.695114 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702419 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702502 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.702517 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.707998 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc 
kubenswrapper[4983]: I0316 00:08:03.719745 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.735440 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.750073 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.761149 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.779020 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721307 6856 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:02.721365 6856 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:02.721399 6856 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721894 6856 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:02.721941 6856 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:02.721950 6856 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:02.721975 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:02.721986 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:02.721991 6856 factory.go:656] Stopping watch factory\\\\nI0316 00:08:02.721998 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:02.722007 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:02.722026 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.790898 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806094 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:03Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806265 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806277 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806292 4983 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.806305 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909504 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909541 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909552 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909568 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:03 crc kubenswrapper[4983]: I0316 00:08:03.909579 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:03Z","lastTransitionTime":"2026-03-16T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012294 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012380 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012389 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012403 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.012411 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115373 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115409 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115422 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.115430 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218606 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218633 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218640 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218652 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.218661 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320585 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320617 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320625 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320638 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.320647 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423056 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423095 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.423130 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525809 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525850 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525859 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525876 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.525887 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.589266 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/0.log" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.591330 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.591893 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.627963 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628016 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628032 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.628045 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.729967 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.729991 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.729999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.730013 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.730021 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832051 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832083 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832092 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.832117 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935061 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935156 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.935176 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:04Z","lastTransitionTime":"2026-03-16T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.967784 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:04 crc kubenswrapper[4983]: I0316 00:08:04.983693 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.005985 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.021730 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.034603 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc 
kubenswrapper[4983]: I0316 00:08:05.037509 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037767 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037796 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.037832 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.049378 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.067476 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721307 6856 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:02.721365 6856 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:02.721399 6856 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721894 6856 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:02.721941 6856 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:02.721950 6856 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:02.721975 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:02.721986 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:02.721991 6856 factory.go:656] Stopping watch factory\\\\nI0316 00:08:02.721998 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:02.722007 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:02.722026 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.081218 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.091924 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092031 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092096 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092031 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.092187 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.092252 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.092566 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.092675 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.093062 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.103182 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.103720 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.104978 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.108447 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.119090 4983 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140128 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140313 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 
00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140546 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.140646 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.141486 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.157065 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.168344 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.183910 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242865 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242908 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242934 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.242945 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344924 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344966 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344978 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.344992 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.345002 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448410 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448457 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.448490 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551062 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551101 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551142 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551157 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.551166 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.596808 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.597534 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/0.log" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.600621 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" exitCode=1 Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.601310 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.601527 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.602266 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" Mar 16 00:08:05 crc kubenswrapper[4983]: E0316 00:08:05.602404 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" 
podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.602451 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.602485 4983 scope.go:117] "RemoveContainer" containerID="ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.617111 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 
crc kubenswrapper[4983]: I0316 00:08:05.643163 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653719 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653788 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653805 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653827 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.653843 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.657345 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.671746 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.689102 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.700593 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.710278 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.719349 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc 
kubenswrapper[4983]: I0316 00:08:05.730073 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.750118 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab4ddd066c3ae01ad96b5a683b146b4670c8eb2bce488cb937b00a9b82614ec4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"message\\\":\\\"io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721307 6856 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:02.721365 6856 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:02.721399 6856 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:02.721894 6856 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:02.721941 6856 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:02.721950 6856 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:02.721975 6856 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:02.721986 6856 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:02.721991 6856 factory.go:656] Stopping watch factory\\\\nI0316 00:08:02.721998 6856 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:02.722007 6856 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:02.722026 6856 ovnkube.go:599] Stopped ovnkube\\\\nI0316 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"
name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"e
nv-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 
00:08:05.756441 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756472 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756485 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.756510 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.763851 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.777726 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.791070 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.804371 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.815678 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.831452 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:05Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858845 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858874 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858885 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.858912 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.960957 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.960996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.961007 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.961028 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:05 crc kubenswrapper[4983]: I0316 00:08:05.961042 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:05Z","lastTransitionTime":"2026-03-16T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063043 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063117 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063136 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063171 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.063205 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165501 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165543 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165554 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165569 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.165580 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268046 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268056 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.268081 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370564 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370589 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370620 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.370644 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472894 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472934 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472942 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.472963 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575579 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575642 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575666 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575695 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.575717 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.605537 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.609673 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" Mar 16 00:08:06 crc kubenswrapper[4983]: E0316 00:08:06.610004 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.622842 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.634097 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc 
kubenswrapper[4983]: I0316 00:08:06.647178 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.659898 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.674265 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678468 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678530 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 
00:08:06.678548 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.678560 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.685835 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.705676 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.718031 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.731211 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.743279 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.753885 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.763186 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.777854 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781484 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781525 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781537 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781580 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.781593 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.790427 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.804271 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 
00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.819559 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:06Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883741 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883829 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883845 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.883856 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.986374 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.986797 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.987471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.987796 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:06 crc kubenswrapper[4983]: I0316 00:08:06.988021 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:06Z","lastTransitionTime":"2026-03-16T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090490 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090538 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090547 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.090577 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091652 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091649 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091796 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.091946 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.091938 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.092176 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.092239 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:07 crc kubenswrapper[4983]: E0316 00:08:07.092348 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193902 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193955 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193969 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.193991 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.194007 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297249 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297260 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297278 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.297291 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399437 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399489 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.399517 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.501957 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.501993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.502003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.502019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.502031 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.605956 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606792 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.606945 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710578 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710596 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.710641 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812817 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812853 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812862 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812883 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.812899 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915861 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915984 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.915998 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:07 crc kubenswrapper[4983]: I0316 00:08:07.916007 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:07Z","lastTransitionTime":"2026-03-16T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018377 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018413 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018434 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.018448 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120356 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120396 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120407 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120421 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.120431 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222377 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222428 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222443 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222466 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.222477 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324740 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324839 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324857 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324882 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.324900 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427012 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427092 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.427143 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529837 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529884 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529916 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.529930 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632102 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632139 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632148 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632163 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.632176 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.687175 4983 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.734327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735201 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735292 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.735381 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.837609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838009 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838198 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838408 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.838540 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890323 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890407 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890441 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.890471 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890581 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890609 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890629 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.890605876 +0000 UTC m=+113.490704306 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890643 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890669 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890674 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.890666178 +0000 UTC m=+113.490764608 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890742 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.89072 +0000 UTC m=+113.490818460 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.890744 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.891015 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.890919856 +0000 UTC m=+113.491018346 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941181 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941219 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941229 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.941253 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:08Z","lastTransitionTime":"2026-03-16T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.992356 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:08 crc kubenswrapper[4983]: I0316 00:08:08.992457 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992629 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992705 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992729 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992653 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992838 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.99281224 +0000 UTC m=+113.592910710 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:08 crc kubenswrapper[4983]: E0316 00:08:08.992868 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:24.992855331 +0000 UTC m=+113.592953791 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044416 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044454 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044462 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044478 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.044488 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092107 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092142 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092107 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.092124 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092236 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092367 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092440 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:09 crc kubenswrapper[4983]: E0316 00:08:09.092480 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146652 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146693 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146703 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146721 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.146730 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249863 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249911 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249927 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.249939 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353876 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353954 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.353983 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.354007 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.457853 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.458561 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.458691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.458852 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.459269 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563359 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563377 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563403 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.563422 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666516 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666582 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666632 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.666653 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.769923 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.769999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.770025 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.770052 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.770069 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873561 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873639 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873660 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.873713 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986061 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986170 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:09 crc kubenswrapper[4983]: I0316 00:08:09.986196 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:09Z","lastTransitionTime":"2026-03-16T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089797 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089815 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089837 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.089854 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192521 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192694 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192722 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192791 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.192826 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.295924 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296322 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296433 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.296536 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.399954 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400307 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400467 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400622 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.400802 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.504937 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505102 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505132 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.505150 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608146 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608210 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608229 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608253 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.608271 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711584 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711606 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.711623 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.813993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.814945 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.815000 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.815031 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.815052 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918200 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918267 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:10 crc kubenswrapper[4983]: I0316 00:08:10.918340 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:10Z","lastTransitionTime":"2026-03-16T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021365 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021445 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021470 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.021486 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.091932 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.091971 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.092013 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092089 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.092126 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092249 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092397 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:11 crc kubenswrapper[4983]: E0316 00:08:11.092516 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123384 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.123433 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.225485 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327220 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327256 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327284 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.327295 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429129 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429175 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429188 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429207 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.429220 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.531875 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.531948 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.531970 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.532001 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.532022 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633849 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633890 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633901 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633914 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.633924 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737029 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737069 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737080 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737100 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.737112 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839547 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839630 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.839673 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942488 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942569 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:11 crc kubenswrapper[4983]: I0316 00:08:11.942613 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:11Z","lastTransitionTime":"2026-03-16T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045295 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045358 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.045411 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.111116 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.131485 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.148952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149011 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149028 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149082 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.149100 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.155362 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.182044 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.203602 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.222638 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.244840 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 
00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251579 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251644 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.251691 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc 
kubenswrapper[4983]: I0316 00:08:12.251710 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253347 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253522 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253628 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253742 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.253863 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.263202 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.272481 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277355 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277415 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277431 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.277442 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.279399 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.294803 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.298736 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304317 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304335 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304359 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.304376 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.312869 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.323312 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.326548 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328331 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328348 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.328388 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.341180 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.346175 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.350904 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.350996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.351016 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.351042 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.351098 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.356728 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc 
kubenswrapper[4983]: E0316 00:08:12.371567 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: E0316 00:08:12.372381 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375021 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375091 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375138 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375515 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.375913 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.376023 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.405832 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.480824 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.480932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.480952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.481014 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.481033 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583679 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583857 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583915 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.583994 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.687215 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.687277 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.687295 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.687317 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.687337 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.791315 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.791386 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.791408 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.791436 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.791457 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.894640 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.894717 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.894739 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.894816 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.894845 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.998614 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.998676 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.998689 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.998719 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:12 crc kubenswrapper[4983]: I0316 00:08:12.998738 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:12Z","lastTransitionTime":"2026-03-16T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092275 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092338 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092339 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.092356 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.092493 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.092709 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.092881 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:13 crc kubenswrapper[4983]: E0316 00:08:13.093162 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.101894 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.101944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.101998 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.102021 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.102035 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.205082 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.205123 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.205133 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.205149 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.205159 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.308612 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.308665 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.308681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.308702 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.308719 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.410705 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.410770 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.410781 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.410798 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.410812 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.516364 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.516419 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.516436 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.516461 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.516478 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.618722 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.618777 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.618786 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.618802 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.618811 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.722097 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.722174 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.722196 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.722225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.722247 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.825430 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.825462 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.825471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.825485 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.825495 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.928425 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.928476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.928491 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.928508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:13 crc kubenswrapper[4983]: I0316 00:08:13.928518 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:13Z","lastTransitionTime":"2026-03-16T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.031064 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.031182 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.031202 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.031229 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.031248 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.133961 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.134025 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.134043 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.134068 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.134085 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.237500 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.237546 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.237554 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.237569 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.237578 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.339992 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.340036 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.340053 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.340070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.340080 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.443064 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.443131 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.443154 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.443184 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.443206 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.546195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.546239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.546250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.546266 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.546280 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.648681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.648738 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.648793 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.648818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.648834 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.693423 4983 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.752062 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.752116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.752132 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.752156 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.752174 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.855199 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.855261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.855286 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.855315 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.855334 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.958898 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.958950 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.958989 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.959015 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:14 crc kubenswrapper[4983]: I0316 00:08:14.959032 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:14Z","lastTransitionTime":"2026-03-16T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.061730 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.061802 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.061815 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.061834 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.061849 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092385 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092430 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092468 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092569 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.092585 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092667 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092738 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:15 crc kubenswrapper[4983]: E0316 00:08:15.092804 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.164230 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.164270 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.164284 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.164305 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.164318 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.267069 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.267105 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.267116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.267132 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.267143 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.369346 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.369398 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.369413 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.369430 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.369446 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.472019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.472046 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.472056 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.472068 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.472079 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.573814 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.573868 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.573877 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.573892 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.573901 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.676620 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.676735 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.676787 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.676822 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.676850 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.779659 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.779714 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.779726 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.779745 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.779779 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.883255 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.883336 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.883351 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.883378 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.883397 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.986355 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.987072 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.987132 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.987166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:15 crc kubenswrapper[4983]: I0316 00:08:15.987190 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:15Z","lastTransitionTime":"2026-03-16T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.089384 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.089452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.089471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.089496 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.089515 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.192442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.192575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.192604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.192638 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.192662 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.295480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.295834 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.295982 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.296115 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.296237 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.399743 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.399840 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.399860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.399889 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.399907 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.502566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.502619 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.502633 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.502653 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.502668 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.604912 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.604960 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.604971 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.604987 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.604998 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.707709 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.708003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.708119 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.708236 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.708397 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.810773 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.810841 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.810860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.810887 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.810906 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.913927 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.913997 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.914022 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.914052 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:16 crc kubenswrapper[4983]: I0316 00:08:16.914075 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:16Z","lastTransitionTime":"2026-03-16T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.018170 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.018239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.018260 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.018284 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.018299 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092463 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092519 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092574 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.092691 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.092923 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.093508 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.093702 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:17 crc kubenswrapper[4983]: E0316 00:08:17.093958 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.094110 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.120818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.120860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.120869 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.120884 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.120895 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.224187 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.224620 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.224642 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.224672 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.224692 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.327948 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.327988 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.327999 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.328017 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.328028 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.430475 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.430530 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.430548 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.430566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.430579 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.533446 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.533497 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.533535 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.533558 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.533570 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635528 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635539 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635554 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.635567 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.649668 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.652181 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a"}
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.652718 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.663613 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.678495 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.696541 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.708313 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.717142 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc 
kubenswrapper[4983]: I0316 00:08:17.728478 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.737980 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738043 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.738053 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.739097 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.751475 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.760638 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.777603 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.787570 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.799350 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.814602 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.824773 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840251 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc 
kubenswrapper[4983]: I0316 00:08:17.840307 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840324 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840342 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.840354 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.842125 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.857617 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:17Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943471 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943582 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943600 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:17 crc kubenswrapper[4983]: I0316 00:08:17.943612 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:17Z","lastTransitionTime":"2026-03-16T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.046923 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.046985 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.047003 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.047028 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.047047 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.093445 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.110375 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150019 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150059 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150069 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150085 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.150102 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252469 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.252480 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355100 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355116 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.355126 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458851 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458928 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458951 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.458983 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.459005 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562252 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562318 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562340 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.562356 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.658590 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.661603 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.662261 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664042 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664352 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664426 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664483 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.664509 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.665043 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/1.log" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.668308 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" exitCode=1 Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.668380 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.668448 4983 scope.go:117] "RemoveContainer" containerID="b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.669299 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:18 crc kubenswrapper[4983]: E0316 00:08:18.669537 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.701817 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.716301 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.735326 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.747560 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.758852 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768578 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768648 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768662 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768678 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.768691 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.773659 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc 
kubenswrapper[4983]: I0316 00:08:18.789320 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.818195 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 
00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.833508 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.850327 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.866236 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871172 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871210 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871222 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871240 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.871251 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.880677 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.897937 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96f
d3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:5
9Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.913377 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.930370 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.948915 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.964965 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973604 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973688 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973706 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973723 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.973736 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:18Z","lastTransitionTime":"2026-03-16T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.979539 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:18 crc kubenswrapper[4983]: I0316 00:08:18.994878 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c022702655e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:18Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc 
kubenswrapper[4983]: I0316 00:08:19.024903 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca
69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.043312 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.061712 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076614 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc 
kubenswrapper[4983]: I0316 00:08:19.076676 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076690 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.076731 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.077395 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092393 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092436 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092523 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092551 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.092581 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092728 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092864 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.092940 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.097224 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.111928 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.128415 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.143374 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.155819 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.170971 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179407 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179451 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179460 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.179487 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.185186 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc 
kubenswrapper[4983]: I0316 00:08:19.202075 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.227480 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.246030 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.273104 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5bc83190a4ad87bafd6bbe58d28b94723a928f079b282e8293ee0d848a43ecc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:05Z\\\",\\\"message\\\":\\\".122358 7021 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0316 00:08:05.122886 7021 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123122 7021 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:05.123531 7021 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123667 7021 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.123871 7021 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0316 00:08:05.124278 7021 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0316 00:08:05.124349 7021 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:05.124393 7021 factory.go:656] Stopping watch factory\\\\nI0316 00:08:05.124402 7021 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:05.124412 7021 ovnkube.go:599] Stopped ovnkube\\\\nI0316 00:08:0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 
7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/
cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283851 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc 
kubenswrapper[4983]: I0316 00:08:19.283876 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283884 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283898 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.283907 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387086 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387144 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387160 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387184 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.387201 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490148 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490184 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.490214 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.592864 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593121 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593270 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.593334 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.674179 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.679858 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:19 crc kubenswrapper[4983]: E0316 00:08:19.680115 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.694291 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696379 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696540 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696605 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.696626 4983 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.715179 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.732049 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.743696 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.757741 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc 
kubenswrapper[4983]: I0316 00:08:19.788702 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799198 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.799219 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.808522 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.821050 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.832987 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.842717 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.861496 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.873105 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.885402 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.901020 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902313 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902343 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902366 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902386 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.902398 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:19Z","lastTransitionTime":"2026-03-16T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.920506 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:
07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.930870 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-
proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:19 crc kubenswrapper[4983]: I0316 00:08:19.945732 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-p
lugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"
mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c
2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:19Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004072 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004101 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004109 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004123 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.004133 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.106572 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107270 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107379 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107485 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.107573 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209738 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209856 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209882 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209913 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.209936 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.312937 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313468 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313615 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.313809 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416577 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416680 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.416698 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.520220 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.520637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.520896 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.521108 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.521266 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624421 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624487 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624533 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.624550 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727699 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727744 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727781 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727800 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.727812 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830873 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830906 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830940 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830958 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.830970 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933570 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933641 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933658 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:20 crc kubenswrapper[4983]: I0316 00:08:20.933723 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:20Z","lastTransitionTime":"2026-03-16T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.036978 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037036 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037046 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037062 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.037071 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.091735 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.091798 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.091941 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092070 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.092129 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092286 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092504 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:21 crc kubenswrapper[4983]: E0316 00:08:21.092604 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.139805 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140518 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140636 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140747 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.140892 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.244598 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245011 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245169 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.245463 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348126 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348194 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348218 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.348235 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.451326 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.451736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.451989 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.452306 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.452506 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556067 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556508 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556744 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.556980 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.557117 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661155 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661171 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.661216 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765747 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765852 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765868 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765889 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.765905 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868265 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868273 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868286 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.868294 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971673 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971745 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971773 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.971792 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:21 crc kubenswrapper[4983]: I0316 00:08:21.972164 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:21Z","lastTransitionTime":"2026-03-16T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074134 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074162 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074170 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074182 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.074191 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.105154 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.121190 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 
00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.136936 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.162301 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176613 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176653 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176667 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176684 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.176697 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.178513 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.192697 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.203282 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.213061 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.223387 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc 
kubenswrapper[4983]: I0316 00:08:22.233973 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.250417 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.260236 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.270998 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278278 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278323 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278338 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278360 4983 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.278376 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.281857 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.293171 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.302900 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.315808 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381089 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381151 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381167 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381189 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.381206 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.484777 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.484830 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.484846 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.484865 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.484875 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.587643 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.587732 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.587789 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.587817 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.587835 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.689650 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.689749 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.689829 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.689855 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.689873 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.754319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.754743 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.755032 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.755302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.755538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.770448 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776131 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776192 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776216 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776244 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.776266 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.793110 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797352 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797416 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797467 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.797489 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.810364 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815536 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815551 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815577 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.815592 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.828568 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833202 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833256 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833280 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.833300 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.847636 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:22Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:22 crc kubenswrapper[4983]: E0316 00:08:22.847888 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850053 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850103 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.850156 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952324 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952422 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:22 crc kubenswrapper[4983]: I0316 00:08:22.952531 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:22Z","lastTransitionTime":"2026-03-16T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.056922 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057445 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057482 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.057508 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091689 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091714 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091811 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092192 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092319 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.091841 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092534 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:23 crc kubenswrapper[4983]: E0316 00:08:23.092578 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.161737 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.162140 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.162319 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.162514 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.163444 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266354 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266395 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266407 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266423 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.266434 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369714 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369819 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369839 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369865 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.369883 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472321 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472635 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472699 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.472785 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575835 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575887 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575904 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575940 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.575954 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678243 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678285 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.678305 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781375 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781439 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781463 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781493 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.781516 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885115 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885172 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885189 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885213 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.885231 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988531 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988592 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988632 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:23 crc kubenswrapper[4983]: I0316 00:08:23.988648 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:23Z","lastTransitionTime":"2026-03-16T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091541 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091609 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091633 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091656 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.091675 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.194659 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195135 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195157 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195185 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.195202 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298046 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298114 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298130 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298156 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.298174 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.400903 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.400976 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.401002 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.401035 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.401057 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505164 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505223 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505244 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.505259 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608693 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608785 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608803 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608832 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.608849 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.711948 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.711989 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.711997 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.712013 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.712022 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814532 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814689 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814712 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814734 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.814751 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917598 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917676 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917694 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917723 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.917745 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:24Z","lastTransitionTime":"2026-03-16T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970194 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970309 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970390 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:24 crc kubenswrapper[4983]: I0316 00:08:24.970425 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970517 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970489238 +0000 UTC m=+145.570587678 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970590 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970616 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970676 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970697 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970599 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:24 crc 
kubenswrapper[4983]: E0316 00:08:24.970698 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970672514 +0000 UTC m=+145.570770984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970830 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970805378 +0000 UTC m=+145.570903888 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:24 crc kubenswrapper[4983]: E0316 00:08:24.970856 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:56.970842489 +0000 UTC m=+145.570941049 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020719 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020810 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020831 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020905 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.020931 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.071603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.071683 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071852 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071902 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071920 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.071927 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.072026 4983 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:57.072002039 +0000 UTC m=+145.672100509 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.072465 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:08:57.072374721 +0000 UTC m=+145.672473171 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092562 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092627 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092671 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.092671 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093049 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093094 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093172 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:25 crc kubenswrapper[4983]: E0316 00:08:25.093252 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124420 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124496 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124520 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.124540 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227364 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227375 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227393 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.227406 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330317 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330371 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330409 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.330425 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.432931 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433030 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433068 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433098 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.433118 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536438 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536456 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536481 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.536499 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639543 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639567 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639598 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.639621 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742107 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742173 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742217 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.742235 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.844932 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.844983 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.844997 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.845018 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.845031 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948429 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948467 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948490 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:25 crc kubenswrapper[4983]: I0316 00:08:25.948499 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:25Z","lastTransitionTime":"2026-03-16T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051206 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051279 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.051292 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153679 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153725 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153766 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.153778 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256360 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256394 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256401 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256416 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.256424 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359175 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359268 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359294 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359321 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.359338 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462031 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462081 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462093 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462110 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.462123 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564147 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564195 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564208 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564225 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.564237 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666840 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666890 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666901 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666919 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.666931 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769877 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769902 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769925 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.769940 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872547 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872586 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872597 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872613 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.872625 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974820 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974878 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974898 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974924 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:26 crc kubenswrapper[4983]: I0316 00:08:26.974943 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:26Z","lastTransitionTime":"2026-03-16T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.077998 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078074 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078092 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078118 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.078137 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092559 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092634 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092668 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.092557 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.092730 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.092888 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.093150 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:27 crc kubenswrapper[4983]: E0316 00:08:27.093235 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181672 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181736 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181782 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181811 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.181828 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284301 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284310 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284329 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.284340 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386489 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386523 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.386537 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489734 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489791 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489800 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489821 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.489831 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591792 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591825 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591839 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.591848 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694158 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694205 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694221 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694238 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.694250 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796872 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796920 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796928 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796943 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.796953 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900390 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900457 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900474 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900500 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:27 crc kubenswrapper[4983]: I0316 00:08:27.900518 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:27Z","lastTransitionTime":"2026-03-16T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003448 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003524 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003544 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003572 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.003592 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106610 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106650 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106688 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.106712 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209694 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209793 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209811 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209834 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.209855 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312381 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312450 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312469 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312488 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.312500 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415368 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415421 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415437 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415461 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.415478 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519196 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519279 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519301 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.519347 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.621906 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.621978 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.622008 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.622041 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.622061 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724855 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724935 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724962 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.724981 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828088 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828143 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828166 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.828215 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930819 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930868 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930884 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930909 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:28 crc kubenswrapper[4983]: I0316 00:08:28.930926 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:28Z","lastTransitionTime":"2026-03-16T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034667 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034738 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034803 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034834 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.034857 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092110 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.092422 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092193 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.092654 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092225 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.092136 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.092901 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:29 crc kubenswrapper[4983]: E0316 00:08:29.093081 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137464 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137516 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137528 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137545 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.137559 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240078 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240138 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240154 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.240195 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342546 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342611 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342631 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342655 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.342675 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446050 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446104 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446141 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.446158 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548522 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548588 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548605 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548631 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.548727 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.650952 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.650993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.651005 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.651020 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.651030 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753574 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753642 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753663 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753693 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.753715 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856591 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856642 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856659 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856683 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.856702 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959688 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959748 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959817 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959849 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:29 crc kubenswrapper[4983]: I0316 00:08:29.959871 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:29Z","lastTransitionTime":"2026-03-16T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062837 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062860 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.062876 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165612 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165623 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165640 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.165653 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269376 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269419 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269442 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.269451 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371548 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371587 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371595 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371611 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.371620 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.474964 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475118 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475149 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475226 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.475253 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578106 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578183 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578197 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578250 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.578268 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.681716 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682071 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682198 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.682553 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.785575 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.786036 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.786239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.786993 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.787200 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890715 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890783 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890795 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890812 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.890823 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994492 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994545 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994557 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994576 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:30 crc kubenswrapper[4983]: I0316 00:08:30.994591 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:30Z","lastTransitionTime":"2026-03-16T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.091929 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.091985 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.092050 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092112 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.092132 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092253 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092393 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:31 crc kubenswrapper[4983]: E0316 00:08:31.092482 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097601 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097705 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097749 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097910 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.097928 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.201889 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202261 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202463 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202666 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.202938 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.306903 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307224 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307409 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307552 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.307678 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410020 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410090 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410113 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410144 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.410166 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513341 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513397 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513414 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513440 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.513459 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.616542 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.616967 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.617182 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.617412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.617606 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720382 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720462 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720485 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720517 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.720538 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.823818 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824057 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824120 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824242 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.824338 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.926645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.926918 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.926996 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.927075 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:31 crc kubenswrapper[4983]: I0316 00:08:31.927152 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:31Z","lastTransitionTime":"2026-03-16T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029231 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029271 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029282 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029301 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.029323 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:32Z","lastTransitionTime":"2026-03-16T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.095995 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:32 crc kubenswrapper[4983]: E0316 00:08:32.096595 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.110863 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06
c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.111704 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.127577 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: E0316 00:08:32.129803 4983 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.146690 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.166122 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.179969 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: E0316 00:08:32.186196 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.206075 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.220365 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.232308 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.244287 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc 
kubenswrapper[4983]: I0316 00:08:32.256801 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.274329 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.286714 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.298533 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.312339 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.327980 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.341245 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:32 crc kubenswrapper[4983]: I0316 00:08:32.355569 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:32Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080178 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080226 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080242 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080258 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.080268 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092127 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092127 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092156 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092637 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092369 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.092162 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092663 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.093289 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.092692 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096566 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096608 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096624 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096645 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.096661 4983 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.107942 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111150 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111184 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111193 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111207 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.111216 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.121894 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125358 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125451 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125465 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125479 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.125491 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.139900 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143432 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143476 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143489 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143507 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:33 crc kubenswrapper[4983]: I0316 00:08:33.143520 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:33Z","lastTransitionTime":"2026-03-16T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.162780 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:33Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:33 crc kubenswrapper[4983]: E0316 00:08:33.162936 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091790 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.092743 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091862 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.093047 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091834 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.093302 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:35 crc kubenswrapper[4983]: I0316 00:08:35.091936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:35 crc kubenswrapper[4983]: E0316 00:08:35.093538 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.309794 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.324986 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.341406 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.356315 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc 
kubenswrapper[4983]: I0316 00:08:36.373635 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.396882 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.439524 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.457772 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.477407 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.505490 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.523137 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.540698 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.558509 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.579876 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.599628 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60c
a38e15123fb0142959d652140f2303b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.622411 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.634105 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.656201 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce3
87ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:36 crc kubenswrapper[4983]: I0316 00:08:36.670536 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:36Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092130 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092248 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092135 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092333 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092434 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:37 crc kubenswrapper[4983]: I0316 00:08:37.092562 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092643 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.092848 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:37 crc kubenswrapper[4983]: E0316 00:08:37.187960 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091639 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.091868 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091647 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091645 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.092030 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:39 crc kubenswrapper[4983]: I0316 00:08:39.091664 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.092183 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:39 crc kubenswrapper[4983]: E0316 00:08:39.092286 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.106806 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.754025 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/0.log" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.754412 4983 generic.go:334] "Generic (PLEG): container finished" podID="f81ec143-6c51-4f96-ae71-a4759bac7c70" containerID="05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7" exitCode=1 Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.754485 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerDied","Data":"05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7"} Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.755369 4983 scope.go:117] "RemoveContainer" containerID="05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.770114 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.786189 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.806085 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.825301 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce3
87ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.837619 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.848846 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.861431 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.874479 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc 
kubenswrapper[4983]: I0316 00:08:40.891183 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.907613 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.927435 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.942260 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.958211 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 
for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.970267 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:40 crc kubenswrapper[4983]: I0316 00:08:40.984878 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:40.999999 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:40Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.011510 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.025205 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.035455 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.091951 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.091951 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.091964 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.092020 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092111 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092169 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092216 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:41 crc kubenswrapper[4983]: E0316 00:08:41.092260 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.760174 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/0.log" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.760253 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9"} Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.777660 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.791633 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.803986 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.823935 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.840215 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.855384 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.868466 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.881249 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.895279 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.906855 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.919042 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.932375 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.941885 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.950855 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.962497 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc 
kubenswrapper[4983]: I0316 00:08:41.975982 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:41 crc kubenswrapper[4983]: I0316 00:08:41.988960 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:41Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.009142 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.027802 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.104674 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.118035 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.133491 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.149566 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.162308 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.175927 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: E0316 00:08:42.188470 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.189279 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.206530 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.217926 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.230686 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.244192 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.261108 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.272958 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.286797 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.298049 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc 
kubenswrapper[4983]: I0316 00:08:42.308316 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.320666 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.340742 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:42 crc kubenswrapper[4983]: I0316 00:08:42.361448 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:42Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092587 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092682 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092706 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.092815 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.093001 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.093173 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.092612 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.093391 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403615 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403665 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403692 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403709 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.403720 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.417724 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422239 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422287 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422302 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422321 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.422335 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.437373 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442357 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442412 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442429 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442451 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.442466 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.456834 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461233 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461291 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461306 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461327 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.461344 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.478232 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491534 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491581 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491596 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491618 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:43 crc kubenswrapper[4983]: I0316 00:08:43.491635 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:43Z","lastTransitionTime":"2026-03-16T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.561907 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:43Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:43 crc kubenswrapper[4983]: E0316 00:08:43.562081 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.092356 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.092367 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.092461 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.092521 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.093091 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.093262 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.093307 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:45 crc kubenswrapper[4983]: E0316 00:08:45.093432 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.093832 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.774785 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.777985 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.778484 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.791153 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.801790 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.812041 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.823895 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.834503 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.851050 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.863312 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e27
1947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.871524 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.882154 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce3
87ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.892070 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.899699 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.907902 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.918392 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 
00:08:45.931203 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.950735 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.961999 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.972824 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.982975 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:45 crc kubenswrapper[4983]: I0316 00:08:45.999212 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:45Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.784837 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.786341 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/2.log" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.791497 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" exitCode=1 Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.791564 4983 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.791621 4983 scope.go:117] "RemoveContainer" containerID="4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.792808 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:08:46 crc kubenswrapper[4983]: E0316 00:08:46.793141 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.814522 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.832868 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.851000 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.870517 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.888814 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.914045 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.928048 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e27
1947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.938547 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.954550 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce3
87ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.981479 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:46 crc kubenswrapper[4983]: I0316 00:08:46.993230 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.005025 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.018454 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 
00:08:47.034362 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.064900 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.081068 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091861 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091915 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091951 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.091936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092039 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092165 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092272 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.092322 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.100705 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.113208 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.129563 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cf7f4f3944262dd843e40017e6456b5f5e0ff5f12825b2ec5e9f448f1f69b5a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:18Z\\\",\\\"message\\\":\\\"r removal\\\\nI0316 00:08:18.006531 7202 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0316 00:08:18.006545 7202 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0316 00:08:18.006586 7202 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0316 00:08:18.007476 7202 handler.go:208] Removed *v1.Namespace event 
handler 1\\\\nI0316 00:08:18.007488 7202 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0316 00:08:18.007518 7202 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:18.007550 7202 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0316 00:08:18.007565 7202 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0316 00:08:18.007556 7202 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0316 00:08:18.007603 7202 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:18.007702 7202 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:18.007615 7202 handler.go:208] Removed *v1.Node event handler 7\\\\nI0316 00:08:18.007804 7202 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:18.007839 7202 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0316 00:08:18.007901 7202 factory.go:656] Stopping watch factory\\\\nI0316 00:08:18.007941 7202 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":
\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5
k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.190173 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.798572 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.805027 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:08:47 crc kubenswrapper[4983]: E0316 00:08:47.805837 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.821858 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.839111 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.858815 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce3
87ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.877612 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.893497 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.912221 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T0
0:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.939263 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.953539 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.969212 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.982006 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:47 crc kubenswrapper[4983]: I0316 00:08:47.994501 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.004746 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc 
kubenswrapper[4983]: I0316 00:08:48.026850 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.039997 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.052571 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.064899 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.076254 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.087486 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:48 crc kubenswrapper[4983]: I0316 00:08:48.103835 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092563 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092637 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092688 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:49 crc kubenswrapper[4983]: I0316 00:08:49.092588 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.092815 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.092933 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.093178 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:49 crc kubenswrapper[4983]: E0316 00:08:49.093276 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092104 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092164 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092212 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:51 crc kubenswrapper[4983]: I0316 00:08:51.092266 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092402 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092516 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092626 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:51 crc kubenswrapper[4983]: E0316 00:08:51.092819 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.111951 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.134209 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.151243 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.168474 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.185970 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc 
kubenswrapper[4983]: E0316 00:08:52.190943 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.210923 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc 
kubenswrapper[4983]: I0316 00:08:52.233225 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.269869 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.302192 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.321554 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.342008 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.358465 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.383689 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.404348 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.433069 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.448312 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\",\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.461541 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.473465 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:52 crc kubenswrapper[4983]: I0316 00:08:52.483798 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092415 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092438 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092547 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.092636 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093133 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093288 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093405 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.093487 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747076 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747637 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747710 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747796 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.747865 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.763589 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767900 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767944 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767953 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767972 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.767982 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.786520 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790714 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790776 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790791 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790807 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.790820 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.805578 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809673 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809703 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809711 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809724 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.809734 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.827382 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.830980 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831077 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831143 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831209 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:08:53 crc kubenswrapper[4983]: I0316 00:08:53.831279 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:08:53Z","lastTransitionTime":"2026-03-16T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.847818 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:08:53Z is after 2025-08-24T17:21:41Z"
Mar 16 00:08:53 crc kubenswrapper[4983]: E0316 00:08:53.847971 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.092382 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.092493 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.092536 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.092571 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.092808 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:55 crc kubenswrapper[4983]: I0316 00:08:55.093180 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.093276 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:55 crc kubenswrapper[4983]: E0316 00:08:55.093403 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.006745 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.006943 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.00691601 +0000 UTC m=+209.607014480 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.007129 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.007205 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.007253 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007340 4983 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007383 4983 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007408 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.007391823 +0000 UTC m=+209.607490293 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007451 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.007433344 +0000 UTC m=+209.607531784 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007549 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007567 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007581 4983 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.007620 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.007608199 +0000 UTC m=+209.607706729 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092290 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092364 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092403 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.092361 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.092531 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.092645 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.092840 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.093012 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.108558 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108689 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108703 4983 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108713 4983 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108750 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.108737857 +0000 UTC m=+209.708836287 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 16 00:08:57 crc kubenswrapper[4983]: I0316 00:08:57.108924 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.108930 4983 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.109022 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs podName:6993dda4-ac10-47af-b406-d49d7781fbe5 nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.108997524 +0000 UTC m=+209.709096054 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs") pod "network-metrics-daemon-qvtjp" (UID: "6993dda4-ac10-47af-b406-d49d7781fbe5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 16 00:08:57 crc kubenswrapper[4983]: E0316 00:08:57.192896 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092158 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092232 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092195 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:08:59 crc kubenswrapper[4983]: I0316 00:08:59.092185 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092397 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092538 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092742 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:08:59 crc kubenswrapper[4983]: E0316 00:08:59.092870 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091929 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091982 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091969 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:01 crc kubenswrapper[4983]: I0316 00:09:01.091945 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092124 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092322 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092450 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:01 crc kubenswrapper[4983]: E0316 00:09:01.092658 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.093288 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"
Mar 16 00:09:02 crc kubenswrapper[4983]: E0316 00:09:02.093650 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf"
Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.112712 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z"
Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.128389 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.145604 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.161040 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc 
kubenswrapper[4983]: I0316 00:09:02.174192 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.188729 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: E0316 00:09:02.193405 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.224428 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.239085 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.257726 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.269362 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.281586 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.295600 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.307861 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.319114 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.330573 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.344464 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.359551 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.368911 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:02 crc kubenswrapper[4983]: I0316 00:09:02.381037 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce3
87ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:02Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.091934 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.091948 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.092274 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092251 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092424 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:03 crc kubenswrapper[4983]: I0316 00:09:03.092405 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092448 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:03 crc kubenswrapper[4983]: E0316 00:09:03.092669 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043402 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043459 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043475 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043499 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.043515 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.058898 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063050 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063097 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063108 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063125 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.063137 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.081652 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085153 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085223 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085238 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085264 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.085299 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.100627 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104681 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104705 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104713 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104726 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.104737 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.122513 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126264 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126317 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126340 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126367 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:04 crc kubenswrapper[4983]: I0316 00:09:04.126385 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:04Z","lastTransitionTime":"2026-03-16T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.142865 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:04Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:04 crc kubenswrapper[4983]: E0316 00:09:04.143001 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092130 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092184 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092229 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092258 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:05 crc kubenswrapper[4983]: I0316 00:09:05.092155 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092328 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092440 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:05 crc kubenswrapper[4983]: E0316 00:09:05.092478 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092290 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092385 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.092516 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092317 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.092666 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:07 crc kubenswrapper[4983]: I0316 00:09:07.092747 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.092874 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.093111 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:07 crc kubenswrapper[4983]: E0316 00:09:07.194696 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092409 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092458 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093227 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092560 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:09 crc kubenswrapper[4983]: I0316 00:09:09.092500 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093398 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093576 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:09 crc kubenswrapper[4983]: E0316 00:09:09.093686 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092406 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092457 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092478 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.092554 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:11 crc kubenswrapper[4983]: I0316 00:09:11.092575 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.092797 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.092924 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:11 crc kubenswrapper[4983]: E0316 00:09:11.093080 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.115658 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.140119 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-tqncp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f81ec143-6c51-4f96-ae71-a4759bac7c70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:40Z\\\",\\\"message\\\":\\\"2026-03-16T00:07:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4\\\\n2026-03-16T00:07:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_5b3070d2-1150-4e00-a426-605cccc1dba4 to /host/opt/cni/bin/\\\\n2026-03-16T00:07:55Z [verbose] multus-daemon started\\\\n2026-03-16T00:07:55Z [verbose] 
Readiness Indicator file check\\\\n2026-03-16T00:08:40Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjz9m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-tqncp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.159411 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43da17ff-aed1-44a2-a154-6800c3dd6ca9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e59fd85278b9ca1d3cfb893cc14b7324b3d6e7ef7e066567055f1122d5db5b49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78381d60ca38e15123fb0142959d652140f23
03b0db7271a52e9f823b246944e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-trbxh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hjpzf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.181177 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9138d88d-b777-4cab-b3d2-2099f01b205b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71d3e3ba56ad718c5cec70239496d2890886d211e77d4e673c61bbf52c517a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee65ef436a527d49530f18d051144c1cba96fd3ba06697509fea82f83c276495\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8a973cb0a4baa087164617c0f6ca69ad9e513b7ba680b35b492e7d8505c409c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621dc51f40c3cd6f123b9a283b952175cc3f0935cfb9d1a6f76518c44f402f87\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eb52
cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eb52cac5ca7c464e7abca8dba0c55114ba576580f5a04c75e41a97291cf1662\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6d414b94082259b6f6a118ff385b12549a48735662c00189978f1f45c587e3bb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:59Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f354efd74345b53940b6b8ed6a8189956b5edffa73ca61c5a0c7b51fe6d1c537\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:08:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-92wtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-pp6bs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: E0316 00:09:12.195851 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.204497 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"63d1127d-8135-495c-8d7f-8fc9cedce271\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fe5fc4f2ed7157d73bff865f195e82a5173876e7dd5e7613f686269d9ab6712\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90cd9488ed3333233c84e1cbaad72b2da1973c03a4ac4ad0d975beb59cf47ac7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://00e4a3853b78677c9fdb4114c9dd7a87a09aa36e3c089edc8ae61f3c67c05ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ece8435d4da26017f8b668e271947f88db2030bb7722a54bdb6a4cd239405d10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.217471 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b470585e-e5b3-4aff-b6d0-b26e9c5cc1b9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae9b7f614bfb0002fb2aec7cf490ad7ab0b98b1db7af7b3a3bdc367419fa6c92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://356cd11c15ee65774cd01b7f21226ccc6dde67ddb2ff2db9fe130569c75ad8ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.235841 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a38880dd-7514-407a-af55-eff24eca32c5\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:08:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:33Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0316 00:07:32.937316 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0316 00:07:32.937434 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0316 00:07:32.938111 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2327715270/tls.crt::/tmp/serving-cert-2327715270/tls.key\\\\\\\"\\\\nI0316 00:07:33.158873 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0316 00:07:33.161276 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0316 00:07:33.161294 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0316 00:07:33.161322 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0316 00:07:33.161327 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0316 00:07:33.167380 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0316 00:07:33.167392 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0316 00:07:33.167420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167428 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0316 00:07:33.167436 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0316 00:07:33.167440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0316 00:07:33.167444 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0316 00:07:33.167449 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0316 00:07:33.168904 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:08:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3159b49866bd2f3cda27d20796d2ff38ce3
87ef27902b80f90f699c63182f719\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.252268 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87f162b8a4f90f4e626776ed80d151e8d8cfa8821b76c55fd262991dbfce5869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.267809 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"48a48757-a3b8-4d4d-92ba-6a2459a26a86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e83a1e853209eb782b1119bfe60e8ed9e9acd1a31ddc27f0fedfd76f3c55c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w6zq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-7sbnj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.279575 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0d106a-af81-491c-81bb-8355cf9faf87\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cff708d51948d658838cbc1b6d212c1cb9745e646aa986199481d1526a1c907\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4957b09ffbf64b6ffbefe423b33dea678a8ded33471ce48a65b643d72868272\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-16T00:07:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig 
--namespace=openshift-kube-controller-manager -v=2\\\\nI0316 00:06:34.097544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0316 00:06:34.100693 1 observer_polling.go:159] Starting file observer\\\\nI0316 00:06:34.138126 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0316 00:06:34.143308 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0316 00:07:04.389672 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T0
0:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c75d8a59997bacc347e80319b694cc1b35a126a5ca63e7cafea07408afd968\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://568b0b8b96ef2c2d80e2823931bdd8f06ff0078e5cf8089a10c328f150751c75\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168
.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.308987 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f94e6bed-1df6-4a06-8181-38d1b25c8617\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:06:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://827d516965773cf4acdd36a2394076e2f875b87646822bfef3440e5002814a31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f62e2e810daec12caa7ed830b5cbe9b0103b0425bccb6351ae069737cd5d2180\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d41836cded02ac85ad3886e1750e6c061bbb820ebff13a054c1b76ec53628e32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://76ae80c4b2c87e0996648b2074d147a3d3d64d5a4ede83e9711f3fbbe9c40961\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4cc2fa159d27a452699b89d473bfffe7e717f23fccbe174955ea9319d5f84b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:06:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://327b1bee207bf97dc2040b41a397c4b1435df8a68010b15cc1470464d2e2e173\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c241732306818be89b2323a40d4437744ee67232dd7bc542e243f2d782f99230\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://621cef826895438cdd6743f238949d28356065ed8a1b74cd0589d5d896504236\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:06:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:06:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:06:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.319549 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.330802 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.340431 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v748m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6740e33-489f-4f45-b3e5-fdceaebf4301\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23dff7c7410e4e4275b3f742dfb0dd34b5a45fd3769d98ceea8b921090aad3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6g8sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v748m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.352394 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-d2h5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"607f8329-b349-45da-bb9b-785740b4ad4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915b0642fbe75e964b0b2679565298c1861ae42cd3f6dd9eea7973b5721873b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ncxpw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-d2h5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.362742 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6993dda4-ac10-47af-b406-d49d7781fbe5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l24hn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qvtjp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc 
kubenswrapper[4983]: I0316 00:09:12.378924 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f055dad5-7c9b-46a1-a715-34847c30d0cf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-16T00:08:46Z\\\",\\\"message\\\":\\\".NetworkPolicy event handler 4\\\\nI0316 00:08:46.005091 7514 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0316 00:08:46.005096 7514 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI0316 00:08:46.005063 7514 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0316 00:08:46.005114 7514 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0316 00:08:46.005131 7514 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0316 00:08:46.005141 7514 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0316 00:08:46.005164 7514 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0316 00:08:46.005200 7514 factory.go:656] Stopping watch factory\\\\nI0316 00:08:46.005212 7514 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0316 00:08:46.005241 7514 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0316 00:08:46.005258 7514 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0316 00:08:46.005266 7514 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-16T00:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294a90ddac12f7fd4c
cfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-16T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88s5k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-16T00:07:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wsfb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.390924 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31953544bd0fa46b044357334aa793b32518f898113bc52e7ffd101ea279931b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:12 crc kubenswrapper[4983]: I0316 00:09:12.405262 4983 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-16T00:07:54Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff5f251cb661c6b4b2c0f286653e4b75bc69ce42c3774467c397f0952b2089c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://160f794c0227026
55e128c5c4c15bcdedcd31f8e509d653408ad463c27bb6fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-16T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:13 crc kubenswrapper[4983]: I0316 00:09:13.092443 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:13 crc kubenswrapper[4983]: I0316 00:09:13.092482 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:13 crc kubenswrapper[4983]: I0316 00:09:13.092482 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:13 crc kubenswrapper[4983]: I0316 00:09:13.092540 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093099 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093242 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093426 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:13 crc kubenswrapper[4983]: E0316 00:09:13.093551 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464295 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464341 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464352 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464392 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.464411 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.479587 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484180 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484254 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484276 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484306 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.484327 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.504365 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509353 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509427 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509452 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509482 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.509505 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.529309 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534007 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534070 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534088 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534111 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.534128 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.551020 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555816 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555856 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555864 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555878 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:14 crc kubenswrapper[4983]: I0316 00:09:14.555887 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:14Z","lastTransitionTime":"2026-03-16T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.573000 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-16T00:09:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"07bf7a14-97e0-4c5e-b357-db0b2f7bca2e\\\",\\\"systemUUID\\\":\\\"2ead470a-f520-44aa-9efc-f4170c7efbf2\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-16T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 16 00:09:14 crc kubenswrapper[4983]: E0316 00:09:14.573103 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092224 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092311 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092384 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092407 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:15 crc kubenswrapper[4983]: I0316 00:09:15.092309 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092508 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092613 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:15 crc kubenswrapper[4983]: E0316 00:09:15.092829 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092009 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092201 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.092236 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092200 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092208 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.092489 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:17 crc kubenswrapper[4983]: I0316 00:09:17.092790 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.092935 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wsfb4_openshift-ovn-kubernetes(f055dad5-7c9b-46a1-a715-34847c30d0cf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.093011 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.093095 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:17 crc kubenswrapper[4983]: E0316 00:09:17.196696 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092313 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.092463 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092486 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092565 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:19 crc kubenswrapper[4983]: I0316 00:09:19.092684 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.092796 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.092675 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:19 crc kubenswrapper[4983]: E0316 00:09:19.093244 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092628 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092656 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092727 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:21 crc kubenswrapper[4983]: I0316 00:09:21.092735 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.092878 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.093008 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.093166 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:21 crc kubenswrapper[4983]: E0316 00:09:21.093311 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.113248 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=80.113233765 podStartE2EDuration="1m20.113233765s" podCreationTimestamp="2026-03-16 00:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.112956058 +0000 UTC m=+170.713054488" watchObservedRunningTime="2026-03-16 00:09:22.113233765 +0000 UTC m=+170.713332195" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.129237 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.129222445 podStartE2EDuration="1m17.129222445s" podCreationTimestamp="2026-03-16 00:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.128726641 +0000 UTC m=+170.728825111" watchObservedRunningTime="2026-03-16 00:09:22.129222445 +0000 UTC m=+170.729320875" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.161250 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.161228604 podStartE2EDuration="42.161228604s" podCreationTimestamp="2026-03-16 00:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.160888814 +0000 UTC m=+170.760987244" watchObservedRunningTime="2026-03-16 00:09:22.161228604 +0000 UTC m=+170.761327044" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.187068 4983 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=64.187049453 podStartE2EDuration="1m4.187049453s" podCreationTimestamp="2026-03-16 00:08:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.184129613 +0000 UTC m=+170.784228053" watchObservedRunningTime="2026-03-16 00:09:22.187049453 +0000 UTC m=+170.787147903" Mar 16 00:09:22 crc kubenswrapper[4983]: E0316 00:09:22.197184 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.238189 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v748m" podStartSLOduration=124.238166897 podStartE2EDuration="2m4.238166897s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.227539435 +0000 UTC m=+170.827637865" watchObservedRunningTime="2026-03-16 00:09:22.238166897 +0000 UTC m=+170.838265327" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.238626 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d2h5k" podStartSLOduration=124.23861867 podStartE2EDuration="2m4.23861867s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.238563588 +0000 UTC m=+170.838662018" watchObservedRunningTime="2026-03-16 00:09:22.23861867 +0000 UTC m=+170.838717100" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.257400 4983 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podStartSLOduration=123.257385805 podStartE2EDuration="2m3.257385805s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.256934683 +0000 UTC m=+170.857033113" watchObservedRunningTime="2026-03-16 00:09:22.257385805 +0000 UTC m=+170.857484235" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.288280 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=50.288263853 podStartE2EDuration="50.288263853s" podCreationTimestamp="2026-03-16 00:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.268731897 +0000 UTC m=+170.868830327" watchObservedRunningTime="2026-03-16 00:09:22.288263853 +0000 UTC m=+170.888362283" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.324290 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tqncp" podStartSLOduration=123.324272032 podStartE2EDuration="2m3.324272032s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.3241911 +0000 UTC m=+170.924289530" watchObservedRunningTime="2026-03-16 00:09:22.324272032 +0000 UTC m=+170.924370462" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.366420 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hjpzf" podStartSLOduration=123.36639407 podStartE2EDuration="2m3.36639407s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.345795504 +0000 UTC m=+170.945893934" watchObservedRunningTime="2026-03-16 00:09:22.36639407 +0000 UTC m=+170.966492510" Mar 16 00:09:22 crc kubenswrapper[4983]: I0316 00:09:22.367041 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-pp6bs" podStartSLOduration=123.367034707 podStartE2EDuration="2m3.367034707s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:22.365936867 +0000 UTC m=+170.966035297" watchObservedRunningTime="2026-03-16 00:09:22.367034707 +0000 UTC m=+170.967133157" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092614 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092649 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.092896 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092939 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:23 crc kubenswrapper[4983]: I0316 00:09:23.092957 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.093095 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.093331 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:23 crc kubenswrapper[4983]: E0316 00:09:23.093523 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702320 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702363 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702372 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702387 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.702397 4983 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-16T00:09:24Z","lastTransitionTime":"2026-03-16T00:09:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.760638 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m"] Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.760994 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.763561 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.766414 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.766950 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.767167 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896229 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb13872-d3dc-4349-b763-f46e4cc112d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896283 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896319 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2fb13872-d3dc-4349-b763-f46e4cc112d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896356 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fb13872-d3dc-4349-b763-f46e4cc112d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.896395 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997847 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb13872-d3dc-4349-b763-f46e4cc112d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997887 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997925 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fb13872-d3dc-4349-b763-f46e4cc112d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997951 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fb13872-d3dc-4349-b763-f46e4cc112d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.997970 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.998037 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.998157 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/2fb13872-d3dc-4349-b763-f46e4cc112d5-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:24 crc kubenswrapper[4983]: I0316 00:09:24.998869 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2fb13872-d3dc-4349-b763-f46e4cc112d5-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.003357 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2fb13872-d3dc-4349-b763-f46e4cc112d5-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.016503 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2fb13872-d3dc-4349-b763-f46e4cc112d5-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tcm4m\" (UID: \"2fb13872-d3dc-4349-b763-f46e4cc112d5\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.076220 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.091514 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.091682 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.091983 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.092029 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.092259 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.092449 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.092710 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:25 crc kubenswrapper[4983]: E0316 00:09:25.092902 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.140593 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.148503 4983 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.938065 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" event={"ID":"2fb13872-d3dc-4349-b763-f46e4cc112d5","Type":"ContainerStarted","Data":"fd381275245e78bc69f2ac7cb422a2e9888d68627fad78d28e5f114a9a1b7eb0"} Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.938202 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" event={"ID":"2fb13872-d3dc-4349-b763-f46e4cc112d5","Type":"ContainerStarted","Data":"fdc0ccc239745118906714b20613e4415d7d481bf0b908fa5ae77271fe1d1f8c"} Mar 16 00:09:25 crc kubenswrapper[4983]: I0316 00:09:25.955183 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tcm4m" podStartSLOduration=126.955158178 podStartE2EDuration="2m6.955158178s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:25.954913012 +0000 UTC m=+174.555011442" watchObservedRunningTime="2026-03-16 00:09:25.955158178 +0000 UTC m=+174.555256648" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.942610 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943329 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/0.log" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943355 4983 generic.go:334] "Generic (PLEG): container finished" podID="f81ec143-6c51-4f96-ae71-a4759bac7c70" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9" exitCode=1 Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943379 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerDied","Data":"dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9"} Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943414 4983 scope.go:117] "RemoveContainer" containerID="05a9d280e6035b2f4588775cc6f5de28a82b4a25a47a9b6186575e8b0bed32d7" Mar 16 00:09:26 crc kubenswrapper[4983]: I0316 00:09:26.943746 4983 scope.go:117] "RemoveContainer" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9" Mar 16 00:09:26 crc kubenswrapper[4983]: E0316 00:09:26.943954 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-tqncp_openshift-multus(f81ec143-6c51-4f96-ae71-a4759bac7c70)\"" pod="openshift-multus/multus-tqncp" podUID="f81ec143-6c51-4f96-ae71-a4759bac7c70" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091622 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091697 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091634 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.091778 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.091776 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.091889 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.092008 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.092064 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:27 crc kubenswrapper[4983]: E0316 00:09:27.198681 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:27 crc kubenswrapper[4983]: I0316 00:09:27.948877 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092230 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092273 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092385 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092236 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:29 crc kubenswrapper[4983]: I0316 00:09:29.092258 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092569 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092677 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:29 crc kubenswrapper[4983]: E0316 00:09:29.092835 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.093653 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.958185 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qvtjp"] Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.958306 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:30 crc kubenswrapper[4983]: E0316 00:09:30.958402 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.962316 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.964261 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerStarted","Data":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.965210 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:09:30 crc kubenswrapper[4983]: I0316 00:09:30.995173 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podStartSLOduration=131.995155931 podStartE2EDuration="2m11.995155931s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:30.992950341 +0000 UTC m=+179.593048771" watchObservedRunningTime="2026-03-16 00:09:30.995155931 +0000 UTC m=+179.595254361" Mar 16 00:09:31 crc kubenswrapper[4983]: I0316 00:09:31.091566 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:31 crc kubenswrapper[4983]: I0316 00:09:31.091566 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:31 crc kubenswrapper[4983]: E0316 00:09:31.091691 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:31 crc kubenswrapper[4983]: I0316 00:09:31.091574 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:31 crc kubenswrapper[4983]: E0316 00:09:31.091803 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:31 crc kubenswrapper[4983]: E0316 00:09:31.091872 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:32 crc kubenswrapper[4983]: I0316 00:09:32.092231 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:32 crc kubenswrapper[4983]: E0316 00:09:32.093217 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:32 crc kubenswrapper[4983]: E0316 00:09:32.199334 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 16 00:09:33 crc kubenswrapper[4983]: I0316 00:09:33.091822 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:33 crc kubenswrapper[4983]: I0316 00:09:33.091848 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:33 crc kubenswrapper[4983]: I0316 00:09:33.091932 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:33 crc kubenswrapper[4983]: E0316 00:09:33.091993 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:33 crc kubenswrapper[4983]: E0316 00:09:33.092023 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:33 crc kubenswrapper[4983]: E0316 00:09:33.092189 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:34 crc kubenswrapper[4983]: I0316 00:09:34.092108 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:34 crc kubenswrapper[4983]: E0316 00:09:34.093290 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:35 crc kubenswrapper[4983]: I0316 00:09:35.092044 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:35 crc kubenswrapper[4983]: I0316 00:09:35.092124 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:09:35 crc kubenswrapper[4983]: I0316 00:09:35.092128 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:35 crc kubenswrapper[4983]: E0316 00:09:35.092245 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 16 00:09:35 crc kubenswrapper[4983]: E0316 00:09:35.092391 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 16 00:09:35 crc kubenswrapper[4983]: E0316 00:09:35.092499 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:36 crc kubenswrapper[4983]: I0316 00:09:36.092610 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:09:36 crc kubenswrapper[4983]: E0316 00:09:36.092939 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5" Mar 16 00:09:37 crc kubenswrapper[4983]: I0316 00:09:37.092494 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:09:37 crc kubenswrapper[4983]: I0316 00:09:37.092573 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.092686 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 16 00:09:37 crc kubenswrapper[4983]: I0316 00:09:37.092525 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.092962 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.093173 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:37 crc kubenswrapper[4983]: E0316 00:09:37.201358 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 16 00:09:38 crc kubenswrapper[4983]: I0316 00:09:38.092878 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:09:38 crc kubenswrapper[4983]: E0316 00:09:38.093076 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:09:39 crc kubenswrapper[4983]: I0316 00:09:39.092088 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:39 crc kubenswrapper[4983]: E0316 00:09:39.092486 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:09:39 crc kubenswrapper[4983]: I0316 00:09:39.092142 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:39 crc kubenswrapper[4983]: E0316 00:09:39.092568 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:39 crc kubenswrapper[4983]: I0316 00:09:39.092124 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:39 crc kubenswrapper[4983]: E0316 00:09:39.092642 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:40 crc kubenswrapper[4983]: I0316 00:09:40.092415 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:09:40 crc kubenswrapper[4983]: E0316 00:09:40.092571 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.091554 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.091645 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.091707 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:41 crc kubenswrapper[4983]: E0316 00:09:41.091701 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:09:41 crc kubenswrapper[4983]: E0316 00:09:41.091860 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:41 crc kubenswrapper[4983]: E0316 00:09:41.091919 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:41 crc kubenswrapper[4983]: I0316 00:09:41.092298 4983 scope.go:117] "RemoveContainer" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9"
Mar 16 00:09:42 crc kubenswrapper[4983]: I0316 00:09:42.005465 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log"
Mar 16 00:09:42 crc kubenswrapper[4983]: I0316 00:09:42.005560 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da"}
Mar 16 00:09:42 crc kubenswrapper[4983]: I0316 00:09:42.092569 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:09:42 crc kubenswrapper[4983]: E0316 00:09:42.094585 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:09:42 crc kubenswrapper[4983]: E0316 00:09:42.202236 4983 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 16 00:09:43 crc kubenswrapper[4983]: I0316 00:09:43.092121 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:43 crc kubenswrapper[4983]: I0316 00:09:43.092176 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:43 crc kubenswrapper[4983]: E0316 00:09:43.092248 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:43 crc kubenswrapper[4983]: I0316 00:09:43.092137 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:43 crc kubenswrapper[4983]: E0316 00:09:43.092370 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:09:43 crc kubenswrapper[4983]: E0316 00:09:43.092606 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:44 crc kubenswrapper[4983]: I0316 00:09:44.092222 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:09:44 crc kubenswrapper[4983]: E0316 00:09:44.092388 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:09:45 crc kubenswrapper[4983]: I0316 00:09:45.091605 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:45 crc kubenswrapper[4983]: I0316 00:09:45.091683 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:45 crc kubenswrapper[4983]: I0316 00:09:45.091681 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:45 crc kubenswrapper[4983]: E0316 00:09:45.091779 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:45 crc kubenswrapper[4983]: E0316 00:09:45.091981 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:45 crc kubenswrapper[4983]: E0316 00:09:45.092052 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:09:46 crc kubenswrapper[4983]: I0316 00:09:46.091972 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:09:46 crc kubenswrapper[4983]: E0316 00:09:46.092221 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvtjp" podUID="6993dda4-ac10-47af-b406-d49d7781fbe5"
Mar 16 00:09:47 crc kubenswrapper[4983]: I0316 00:09:47.092535 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:47 crc kubenswrapper[4983]: I0316 00:09:47.092615 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:47 crc kubenswrapper[4983]: E0316 00:09:47.092694 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 16 00:09:47 crc kubenswrapper[4983]: I0316 00:09:47.092818 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:47 crc kubenswrapper[4983]: E0316 00:09:47.092900 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 16 00:09:47 crc kubenswrapper[4983]: E0316 00:09:47.093042 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 16 00:09:48 crc kubenswrapper[4983]: I0316 00:09:48.092154 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp"
Mar 16 00:09:48 crc kubenswrapper[4983]: I0316 00:09:48.094673 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 16 00:09:48 crc kubenswrapper[4983]: I0316 00:09:48.095575 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.091947 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.092052 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.092064 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.096388 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.096888 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.097332 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 16 00:09:49 crc kubenswrapper[4983]: I0316 00:09:49.103813 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 16 00:09:53 crc kubenswrapper[4983]: I0316 00:09:53.448874 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:09:53 crc kubenswrapper[4983]: I0316 00:09:53.449808 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:09:53 crc kubenswrapper[4983]: I0316 00:09:53.496033 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.420480 4983 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.471156 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc9bv"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.472438 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.473297 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t4lj8"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.474120 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.474911 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.475335 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"
Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.475997 4983 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.476080 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.476661 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.477704 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2"
Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482354 4983 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482446 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482561 4983 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482595 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482683 4983 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482713 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482856 4983 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object
Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.482891 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.482967 4983 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.483001 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 16 00:09:55 crc kubenswrapper[4983]: W0316 00:09:55.486249 4983 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 16 00:09:55 crc kubenswrapper[4983]: E0316 00:09:55.486485 4983 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.487110 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.487359 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.488205 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.488874 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.488936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.489496 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.489708 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.497452 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.498127 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.498660 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.499415 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.499862 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.499993 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.500324 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.500868 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.500974 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.518393 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-fp4l5"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.519629 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fp4l5"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.521790 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.522182 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.523175 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.523528 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.530961 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531170 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.530531 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531278 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531549 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531656 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531690 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531812 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.531971 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532027 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532113 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532278 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532383 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.532492 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.535581 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.536408 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.536853 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lx4mf"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537094 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537154 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537233 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.537274 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538241 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538365 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538411 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538627 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538635 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538843 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.538890 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539231 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539251 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539288 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539415 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539525 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539566 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.539631 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.540014 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.540236 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.541231 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.541442 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.541916 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.542090 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.542718 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543118 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543340 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543391 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.543742 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.544384 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.544823 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9k8tn"]
Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.545342 4983 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.548630 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29560320-9tclx"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550115 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550261 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550408 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.550485 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.551425 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.551602 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.552776 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.554003 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.558279 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.558666 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.562988 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.563200 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.563417 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.564278 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.564384 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.567092 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.567557 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568004 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568315 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568397 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.568697 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.569405 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.570520 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-6j9qt"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.601045 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qzvb8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.601818 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.604808 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.618916 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-w8qpq"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624476 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624645 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624694 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624920 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.624983 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.625206 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626032 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626152 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626436 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.626858 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627167 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627374 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627168 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627719 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627890 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627263 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627330 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627523 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.627574 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.628362 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.628443 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.628547 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 16 00:09:55 crc 
kubenswrapper[4983]: I0316 00:09:55.628881 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.629064 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.629708 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.632312 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.632567 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.632966 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.633088 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.633118 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.634692 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.634956 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635407 4983 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635573 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635733 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.635839 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636019 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636706 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcf712a-d77b-446c-b9e8-7083ff491d3c-serving-cert\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636737 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636780 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-config\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636797 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636815 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636840 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/288bbae2-d98f-4e70-8f83-314c8a7a038b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636856 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: 
\"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636867 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.636870 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n25x\" (UniqueName: \"kubernetes.io/projected/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-kube-api-access-6n25x\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637114 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dwvz\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-kube-api-access-9dwvz\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637135 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637152 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637206 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637223 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637287 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637306 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-config\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " 
pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637346 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637367 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637383 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44hs\" (UniqueName: \"kubernetes.io/projected/ddcf712a-d77b-446c-b9e8-7083ff491d3c-kube-api-access-l44hs\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637400 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637426 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637440 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637457 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637475 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637517 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-trusted-ca\") pod 
\"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637568 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/288bbae2-d98f-4e70-8f83-314c8a7a038b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637647 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637829 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-serving-cert\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.637975 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.638358 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.638523 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"trusted-ca" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.641510 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.642078 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.642484 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.643104 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.643392 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.644099 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.644964 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.645892 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.646241 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t4lj8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.647273 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.648156 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ml6pw"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.649066 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.649159 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.649804 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.650583 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.651333 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.651737 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njztx"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.654983 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t65x6"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.655926 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.657184 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.657608 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.657801 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.659186 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.660834 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n22z7"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.662165 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.663745 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.669303 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-82r5r"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.669588 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.671704 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.675919 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.676499 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.678251 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.678674 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.679685 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.680299 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.681047 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.682199 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.682717 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.684143 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc9bv"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.685188 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6j9qt"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.686449 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qzvb8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.687574 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.689071 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.690225 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"] Mar 16 00:09:55 crc 
kubenswrapper[4983]: I0316 00:09:55.691624 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.693052 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.694485 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fp4l5"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.694738 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.695746 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.697605 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lx4mf"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.698957 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.700395 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.701724 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.702982 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.704077 
4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.705317 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-9tclx"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.706371 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.707346 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.710028 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9k8tn"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.711689 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.713183 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.714480 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.714842 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.715446 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.716885 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-9c57cc56f-t65x6"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.718030 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.719110 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.720207 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.721342 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.722629 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mjkh8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.723444 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.723747 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5zxcb"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.724295 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.724887 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ml6pw"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.725849 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.726841 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.727924 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n22z7"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.729084 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njztx"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.730190 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.731215 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.732243 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mjkh8"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.733243 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5zxcb"] Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.734291 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738551 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-config\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738594 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738630 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738658 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/288bbae2-d98f-4e70-8f83-314c8a7a038b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738683 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n25x\" (UniqueName: \"kubernetes.io/projected/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-kube-api-access-6n25x\") pod 
\"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738704 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738725 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dwvz\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-kube-api-access-9dwvz\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738747 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738796 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 
16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738820 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738899 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738930 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738953 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-config\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.738976 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: 
\"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739003 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44hs\" (UniqueName: \"kubernetes.io/projected/ddcf712a-d77b-446c-b9e8-7083ff491d3c-kube-api-access-l44hs\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739026 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739047 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739081 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739104 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739125 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739147 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739165 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-trusted-ca\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739181 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739196 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/288bbae2-d98f-4e70-8f83-314c8a7a038b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739219 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-serving-cert\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739236 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcf712a-d77b-446c-b9e8-7083ff491d3c-serving-cert\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739252 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.739414 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-config\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.740528 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.741560 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.741916 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.742110 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 
16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.742416 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-config\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.742483 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.743201 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744337 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744737 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: 
\"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744849 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/288bbae2-d98f-4e70-8f83-314c8a7a038b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.744946 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.745864 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.746278 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/288bbae2-d98f-4e70-8f83-314c8a7a038b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.746712 4983 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.747007 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.747064 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ddcf712a-d77b-446c-b9e8-7083ff491d3c-trusted-ca\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.747786 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.748337 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ddcf712a-d77b-446c-b9e8-7083ff491d3c-serving-cert\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:55 crc 
kubenswrapper[4983]: I0316 00:09:55.749088 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.749135 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.750304 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-serving-cert\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.755374 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.775288 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.805745 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.817113 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.838948 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.854286 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.875274 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.894356 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.914559 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.934553 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.955395 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.974982 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 16 00:09:55 crc kubenswrapper[4983]: I0316 00:09:55.994563 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.015147 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.034489 
4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.055116 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.074172 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.094499 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.114119 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.134016 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.154582 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.174995 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.194790 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.214688 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.235111 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.255434 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.294793 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.315173 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.335673 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345249 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-config\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345288 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-client\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345314 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-encryption-config\") 
pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345339 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2688c073-5209-4258-a681-186370d9abcc-machine-approver-tls\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345358 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-policies\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345382 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd00ffd-95e2-47bf-a6fd-663526b2283d-trusted-ca\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345424 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345453 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345480 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345507 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfpdm\" (UniqueName: \"kubernetes.io/projected/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-kube-api-access-jfpdm\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345563 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-serving-cert\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345606 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29gpp\" (UniqueName: \"kubernetes.io/projected/9c737bbb-9153-4689-bbd7-1925cd53b343-kube-api-access-29gpp\") pod 
\"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345653 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345786 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345825 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5kgk\" (UniqueName: \"kubernetes.io/projected/211771ed-66f1-4866-b193-5da61bbd38b4-kube-api-access-l5kgk\") pod \"downloads-7954f5f757-6j9qt\" (UID: \"211771ed-66f1-4866-b193-5da61bbd38b4\") " pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345875 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-oauth-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345909 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldpp\" (UniqueName: \"kubernetes.io/projected/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-kube-api-access-6ldpp\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.345960 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.345978 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:56.845964423 +0000 UTC m=+205.446062963 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346000 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346059 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-auth-proxy-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346080 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346127 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-dir\") pod 
\"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346148 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346190 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346215 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-image-import-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346236 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346276 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346299 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b0e4e23-a158-4597-b005-db088a652ec8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346328 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqtq5\" (UniqueName: \"kubernetes.io/projected/d76474c2-7d5c-45a0-8869-d829b0c594d6-kube-api-access-kqtq5\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346375 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzzmt\" (UniqueName: \"kubernetes.io/projected/2688c073-5209-4258-a681-186370d9abcc-kube-api-access-bzzmt\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346397 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod 
\"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346439 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346459 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-service-ca\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346510 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346530 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346554 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346603 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346624 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346697 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346723 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs6fr\" (UniqueName: \"kubernetes.io/projected/bcce228b-5abb-4cbb-8f79-57326a3a9665-kube-api-access-fs6fr\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346744 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346803 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346872 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346896 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-oauth-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346918 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.346966 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-encryption-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347002 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347057 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit-dir\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347083 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8lpz\" (UniqueName: \"kubernetes.io/projected/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-kube-api-access-g8lpz\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.347111 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347138 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcce228b-5abb-4cbb-8f79-57326a3a9665-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347167 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpkm\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-kube-api-access-gwpkm\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347222 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347249 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") pod \"image-pruner-29560320-9tclx\" 
(UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347284 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcce228b-5abb-4cbb-8f79-57326a3a9665-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347310 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-images\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347332 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczdg\" (UniqueName: \"kubernetes.io/projected/6b0e4e23-a158-4597-b005-db088a652ec8-kube-api-access-zczdg\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347357 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-trusted-ca-bundle\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347424 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-node-pullsecrets\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347453 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347477 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347499 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347523 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc 
kubenswrapper[4983]: I0316 00:09:56.347562 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347597 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-metrics-tls\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.347621 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebd00ffd-95e2-47bf-a6fd-663526b2283d-metrics-tls\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.355553 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.376851 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.396101 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.415194 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.435295 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448190 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.448350 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:56.948328028 +0000 UTC m=+205.548426468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448405 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448453 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/de84a408-0f98-48c6-83a5-e6976b576989-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448486 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-certs\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448522 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") 
pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448560 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e46a85-c462-4ef3-a944-6ed47d2b0598-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448597 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcce228b-5abb-4cbb-8f79-57326a3a9665-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448631 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-images\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448667 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zczdg\" (UniqueName: \"kubernetes.io/projected/6b0e4e23-a158-4597-b005-db088a652ec8-kube-api-access-zczdg\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448696 4983 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-trusted-ca-bundle\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448726 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-node-pullsecrets\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448809 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448847 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448903 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448962 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-socket-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.448994 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-config\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449003 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-node-pullsecrets\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449026 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-encryption-config\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449161 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-key\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449236 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2688c073-5209-4258-a681-186370d9abcc-machine-approver-tls\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449288 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-tmpfs\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449345 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f53f35-efe7-4b1c-9a25-d82b824c156f-config\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449401 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-default-certificate\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449506 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef72d9a-3e65-495f-8e73-ee539c10a29e-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449558 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449631 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfpdm\" (UniqueName: \"kubernetes.io/projected/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-kube-api-access-jfpdm\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449698 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-images\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449701 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5kgk\" (UniqueName: \"kubernetes.io/projected/211771ed-66f1-4866-b193-5da61bbd38b4-kube-api-access-l5kgk\") pod \"downloads-7954f5f757-6j9qt\" (UID: \"211771ed-66f1-4866-b193-5da61bbd38b4\") " pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449791 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-config\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449830 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwxb\" (UniqueName: \"kubernetes.io/projected/34398886-1821-47c0-bbff-951177287627-kube-api-access-9jwxb\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449854 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449876 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssz5w\" (UniqueName: \"kubernetes.io/projected/31984625-3905-4d4d-9c52-e7d11c6c15d4-kube-api-access-ssz5w\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449899 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/211c2269-7173-4fcb-9403-be48b10ab364-metrics-tls\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " 
pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449933 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.449972 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450016 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qnw\" (UniqueName: \"kubernetes.io/projected/de84a408-0f98-48c6-83a5-e6976b576989-kube-api-access-q6qnw\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450038 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450087 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450108 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450134 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e46a85-c462-4ef3-a944-6ed47d2b0598-config\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450149 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94hh\" (UniqueName: \"kubernetes.io/projected/33ff8ce9-2d36-4251-ae9d-802d9965bfde-kube-api-access-w94hh\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450165 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450181 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450200 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450225 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b0e4e23-a158-4597-b005-db088a652ec8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450297 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-cabundle\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450338 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx99q\" (UniqueName: 
\"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450395 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450429 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450478 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450518 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmbs\" (UniqueName: \"kubernetes.io/projected/61000119-35ce-40ee-a8c5-5ad9052b539d-kube-api-access-fnmbs\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.450552 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450571 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450604 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450672 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450737 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22b9ac88-75ea-4572-bd27-f819caf4d8e2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450878 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-etcd-client\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450956 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs6fr\" (UniqueName: \"kubernetes.io/projected/bcce228b-5abb-4cbb-8f79-57326a3a9665-kube-api-access-fs6fr\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451402 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfea0242-abc1-4912-a193-6c4dc75d9bb5-service-ca-bundle\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451475 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451539 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/33ff8ce9-2d36-4251-ae9d-802d9965bfde-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451645 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-oauth-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451701 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451483 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.451746 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-656xd\" (UniqueName: \"kubernetes.io/projected/9c413c46-e4ff-43f2-b66a-8a62e1f08890-kube-api-access-656xd\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452237 
4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452278 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnznm\" (UniqueName: \"kubernetes.io/projected/8820c8ae-e5d3-4c91-8724-ec666e783179-kube-api-access-tnznm\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452303 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef72d9a-3e65-495f-8e73-ee539c10a29e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452329 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-encryption-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452353 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452369 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit-dir\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452415 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8lpz\" (UniqueName: \"kubernetes.io/projected/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-kube-api-access-g8lpz\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452434 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"auto-csr-approver-29560328-sngnj\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") " pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcce228b-5abb-4cbb-8f79-57326a3a9665-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452498 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwpkm\" (UniqueName: 
\"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-kube-api-access-gwpkm\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452515 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-webhook-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452533 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-stats-auth\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.452817 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453052 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-encryption-config\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453148 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/6b0e4e23-a158-4597-b005-db088a652ec8-config\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453226 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcce228b-5abb-4cbb-8f79-57326a3a9665-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453290 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-audit-dir\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453387 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453607 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2688c073-5209-4258-a681-186370d9abcc-machine-approver-tls\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.453671 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.450853 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-trusted-ca-bundle\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.454504 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.454883 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.455476 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.455612 4983 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.456353 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-oauth-serving-cert\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457056 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ccd96-ced1-466f-8891-72abc221bbac-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457153 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-mountpoint-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457231 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bcce228b-5abb-4cbb-8f79-57326a3a9665-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.457973 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.458120 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31984625-3905-4d4d-9c52-e7d11c6c15d4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.458174 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-srv-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.458859 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b0e4e23-a158-4597-b005-db088a652ec8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.459210 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-encryption-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc 
kubenswrapper[4983]: I0316 00:09:56.460825 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.460995 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-registration-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.461170 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.461396 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.461748 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.462351 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-metrics-tls\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.462531 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebd00ffd-95e2-47bf-a6fd-663526b2283d-metrics-tls\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.462666 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-client\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463222 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnnxd\" (UniqueName: \"kubernetes.io/projected/22b9ac88-75ea-4572-bd27-f819caf4d8e2-kube-api-access-rnnxd\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463445 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72hqz\" (UniqueName: 
\"kubernetes.io/projected/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-kube-api-access-72hqz\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463643 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9zp5\" (UniqueName: \"kubernetes.io/projected/dfea0242-abc1-4912-a193-6c4dc75d9bb5-kube-api-access-q9zp5\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463812 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-policies\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.463983 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd00ffd-95e2-47bf-a6fd-663526b2283d-trusted-ca\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464149 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cp78\" (UniqueName: \"kubernetes.io/projected/55da5246-1df8-4666-ad7c-9407719b3abb-kube-api-access-9cp78\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.464317 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bspth\" (UniqueName: \"kubernetes.io/projected/b227bf69-003e-4831-8ce3-a5b1f7f85c31-kube-api-access-bspth\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464504 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465015 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-serving-cert\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465178 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29gpp\" (UniqueName: \"kubernetes.io/projected/9c737bbb-9153-4689-bbd7-1925cd53b343-kube-api-access-29gpp\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465342 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptgzs\" (UniqueName: \"kubernetes.io/projected/5373e962-abd6-4153-9cc9-7d17b9ae5fe5-kube-api-access-ptgzs\") pod \"migrator-59844c95c7-wp86n\" (UID: \"5373e962-abd6-4153-9cc9-7d17b9ae5fe5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465574 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465727 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.465954 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-oauth-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.466041 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:56.966022046 +0000 UTC m=+205.566120486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466240 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31984625-3905-4d4d-9c52-e7d11c6c15d4-proxy-tls\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466360 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466448 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.464692 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-policies\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.466566 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ebd00ffd-95e2-47bf-a6fd-663526b2283d-trusted-ca\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.467028 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldpp\" (UniqueName: \"kubernetes.io/projected/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-kube-api-access-6ldpp\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.467183 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p57k\" (UniqueName: \"kubernetes.io/projected/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-kube-api-access-7p57k\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.467424 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2zdq\" (UniqueName: \"kubernetes.io/projected/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-kube-api-access-k2zdq\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.467574 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-metrics-certs\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468355 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468255 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-metrics-tls\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468505 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-etcd-client\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") 
" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468702 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.468941 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469108 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlk8h\" (UniqueName: \"kubernetes.io/projected/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-kube-api-access-hlk8h\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469261 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-auth-proxy-config\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469405 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-serving-cert\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469553 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469208 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469858 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz6gs\" (UniqueName: \"kubernetes.io/projected/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-kube-api-access-jz6gs\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.469983 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ebd00ffd-95e2-47bf-a6fd-663526b2283d-metrics-tls\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 
crc kubenswrapper[4983]: I0316 00:09:56.470002 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-images\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470338 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-dir\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470482 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470616 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-profile-collector-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2688c073-5209-4258-a681-186370d9abcc-auth-proxy-config\") pod 
\"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470892 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c737bbb-9153-4689-bbd7-1925cd53b343-audit-dir\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.470923 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-image-import-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471162 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f53f35-efe7-4b1c-9a25-d82b824c156f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471322 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471486 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471650 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqtq5\" (UniqueName: \"kubernetes.io/projected/d76474c2-7d5c-45a0-8869-d829b0c594d6-kube-api-access-kqtq5\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471829 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzzmt\" (UniqueName: \"kubernetes.io/projected/2688c073-5209-4258-a681-186370d9abcc-kube-api-access-bzzmt\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471861 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.471996 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb9f9\" (UniqueName: \"kubernetes.io/projected/211c2269-7173-4fcb-9403-be48b10ab364-kube-api-access-zb9f9\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.472225 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472302 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472465 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-service-ca\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472529 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-image-import-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472549 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b227bf69-003e-4831-8ce3-a5b1f7f85c31-serving-cert\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc 
kubenswrapper[4983]: I0316 00:09:56.472612 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472663 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stng\" (UniqueName: \"kubernetes.io/projected/74d1b439-9506-4a1a-a1a4-3f5ca7944750-kube-api-access-8stng\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472705 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-node-bootstrap-token\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472746 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472824 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-56ljn\" 
(UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472852 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472877 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-plugins-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472895 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e46a85-c462-4ef3-a944-6ed47d2b0598-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472912 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6f53f35-efe7-4b1c-9a25-d82b824c156f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.472928 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b227bf69-003e-4831-8ce3-a5b1f7f85c31-config\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472944 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/143ccd96-ced1-466f-8891-72abc221bbac-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472953 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c737bbb-9153-4689-bbd7-1925cd53b343-serving-cert\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.472962 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143ccd96-ced1-466f-8891-72abc221bbac-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473045 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-srv-cert\") pod 
\"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473082 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74d1b439-9506-4a1a-a1a4-3f5ca7944750-proxy-tls\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473126 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqlkf\" (UniqueName: \"kubernetes.io/projected/aef72d9a-3e65-495f-8e73-ee539c10a29e-kube-api-access-pqlkf\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473184 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473261 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-service-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: 
I0316 00:09:56.473292 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211c2269-7173-4fcb-9403-be48b10ab364-config-volume\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473350 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.473381 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-csi-data-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.475150 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.475272 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c737bbb-9153-4689-bbd7-1925cd53b343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.476003 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d76474c2-7d5c-45a0-8869-d829b0c594d6-service-ca\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477061 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477182 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477319 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.477501 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d76474c2-7d5c-45a0-8869-d829b0c594d6-console-oauth-config\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.478082 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-serving-ca\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.482397 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.494410 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.515362 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.534332 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.554998 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.574695 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575250 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575527 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.575670 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.075531205 +0000 UTC m=+205.675629665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575849 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptgzs\" (UniqueName: \"kubernetes.io/projected/5373e962-abd6-4153-9cc9-7d17b9ae5fe5-kube-api-access-ptgzs\") pod \"migrator-59844c95c7-wp86n\" (UID: \"5373e962-abd6-4153-9cc9-7d17b9ae5fe5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575883 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575910 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31984625-3905-4d4d-9c52-e7d11c6c15d4-proxy-tls\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575948 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p57k\" (UniqueName: \"kubernetes.io/projected/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-kube-api-access-7p57k\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575977 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2zdq\" (UniqueName: \"kubernetes.io/projected/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-kube-api-access-k2zdq\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.575999 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-metrics-certs\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.576018 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.576054 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlk8h\" (UniqueName: \"kubernetes.io/projected/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-kube-api-access-hlk8h\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.576512 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.076495453 +0000 UTC m=+205.676593893 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577095 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-serving-cert\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577210 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577313 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz6gs\" (UniqueName: \"kubernetes.io/projected/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-kube-api-access-jz6gs\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577420 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-images\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577524 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577630 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-profile-collector-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577737 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f53f35-efe7-4b1c-9a25-d82b824c156f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577860 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.577978 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb9f9\" (UniqueName: \"kubernetes.io/projected/211c2269-7173-4fcb-9403-be48b10ab364-kube-api-access-zb9f9\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578074 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b227bf69-003e-4831-8ce3-a5b1f7f85c31-serving-cert\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578191 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stng\" (UniqueName: \"kubernetes.io/projected/74d1b439-9506-4a1a-a1a4-3f5ca7944750-kube-api-access-8stng\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578301 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-node-bootstrap-token\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578410 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578509 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-plugins-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578611 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e46a85-c462-4ef3-a944-6ed47d2b0598-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578915 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b6f53f35-efe7-4b1c-9a25-d82b824c156f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b227bf69-003e-4831-8ce3-a5b1f7f85c31-config\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578516 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-auth-proxy-config\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.578942 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-plugins-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579228 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/143ccd96-ced1-466f-8891-72abc221bbac-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579412 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579428 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143ccd96-ced1-466f-8891-72abc221bbac-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: 
\"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579485 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-srv-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74d1b439-9506-4a1a-a1a4-3f5ca7944750-proxy-tls\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579563 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqlkf\" (UniqueName: \"kubernetes.io/projected/aef72d9a-3e65-495f-8e73-ee539c10a29e-kube-api-access-pqlkf\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-service-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579665 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/211c2269-7173-4fcb-9403-be48b10ab364-config-volume\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579790 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-csi-data-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579833 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/de84a408-0f98-48c6-83a5-e6976b576989-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579906 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-certs\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579925 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-metrics-certs\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579941 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-csi-data-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.579998 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e46a85-c462-4ef3-a944-6ed47d2b0598-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580194 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580253 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-socket-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-key\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580320 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-tmpfs\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580351 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f53f35-efe7-4b1c-9a25-d82b824c156f-config\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580380 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-default-certificate\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580430 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef72d9a-3e65-495f-8e73-ee539c10a29e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580495 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-config\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.580536 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-service-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580539 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwxb\" (UniqueName: \"kubernetes.io/projected/34398886-1821-47c0-bbff-951177287627-kube-api-access-9jwxb\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580611 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssz5w\" (UniqueName: \"kubernetes.io/projected/31984625-3905-4d4d-9c52-e7d11c6c15d4-kube-api-access-ssz5w\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580653 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/211c2269-7173-4fcb-9403-be48b10ab364-metrics-tls\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580694 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580777 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qnw\" (UniqueName: \"kubernetes.io/projected/de84a408-0f98-48c6-83a5-e6976b576989-kube-api-access-q6qnw\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580816 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580873 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e46a85-c462-4ef3-a944-6ed47d2b0598-config\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580906 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94hh\" (UniqueName: \"kubernetes.io/projected/33ff8ce9-2d36-4251-ae9d-802d9965bfde-kube-api-access-w94hh\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580937 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.580970 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-cabundle\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581003 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581058 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmbs\" (UniqueName: \"kubernetes.io/projected/61000119-35ce-40ee-a8c5-5ad9052b539d-kube-api-access-fnmbs\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581089 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 
00:09:56.581310 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-tmpfs\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.581974 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-socket-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582255 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e46a85-c462-4ef3-a944-6ed47d2b0598-config\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582259 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aef72d9a-3e65-495f-8e73-ee539c10a29e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582366 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22b9ac88-75ea-4572-bd27-f819caf4d8e2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582383 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-etcd-ca\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582405 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-etcd-client\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582434 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34398886-1821-47c0-bbff-951177287627-config\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582452 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfea0242-abc1-4912-a193-6c4dc75d9bb5-service-ca-bundle\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582488 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ff8ce9-2d36-4251-ae9d-802d9965bfde-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: 
\"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582525 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582558 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-656xd\" (UniqueName: \"kubernetes.io/projected/9c413c46-e4ff-43f2-b66a-8a62e1f08890-kube-api-access-656xd\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582589 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnznm\" (UniqueName: \"kubernetes.io/projected/8820c8ae-e5d3-4c91-8724-ec666e783179-kube-api-access-tnznm\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582618 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef72d9a-3e65-495f-8e73-ee539c10a29e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582661 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7vq\" 
(UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"auto-csr-approver-29560328-sngnj\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") " pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582703 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-webhook-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582735 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-stats-auth\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582810 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ccd96-ced1-466f-8891-72abc221bbac-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-mountpoint-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582887 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31984625-3905-4d4d-9c52-e7d11c6c15d4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582918 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-srv-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.582944 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-registration-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583030 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583068 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnnxd\" (UniqueName: \"kubernetes.io/projected/22b9ac88-75ea-4572-bd27-f819caf4d8e2-kube-api-access-rnnxd\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: 
\"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583098 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-kube-api-access-72hqz\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583132 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9zp5\" (UniqueName: \"kubernetes.io/projected/dfea0242-abc1-4912-a193-6c4dc75d9bb5-kube-api-access-q9zp5\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583164 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cp78\" (UniqueName: \"kubernetes.io/projected/55da5246-1df8-4666-ad7c-9407719b3abb-kube-api-access-9cp78\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583186 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bspth\" (UniqueName: \"kubernetes.io/projected/b227bf69-003e-4831-8ce3-a5b1f7f85c31-kube-api-access-bspth\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583485 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/dfea0242-abc1-4912-a193-6c4dc75d9bb5-service-ca-bundle\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583504 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143ccd96-ced1-466f-8891-72abc221bbac-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.583737 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-mountpoint-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.584230 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8820c8ae-e5d3-4c91-8724-ec666e783179-registration-dir\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.584356 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143ccd96-ced1-466f-8891-72abc221bbac-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.584867 4983 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31984625-3905-4d4d-9c52-e7d11c6c15d4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.585421 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.585468 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-serving-cert\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.585884 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e46a85-c462-4ef3-a944-6ed47d2b0598-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.586692 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33ff8ce9-2d36-4251-ae9d-802d9965bfde-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: 
\"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.586743 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-default-certificate\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.587404 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/34398886-1821-47c0-bbff-951177287627-etcd-client\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.587694 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aef72d9a-3e65-495f-8e73-ee539c10a29e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.588090 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.588626 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/dfea0242-abc1-4912-a193-6c4dc75d9bb5-stats-auth\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.594993 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.614974 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.621405 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f53f35-efe7-4b1c-9a25-d82b824c156f-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.638594 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.653414 4983 request.go:700] Waited for 1.009011976s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/secrets?fieldSelector=metadata.name%3Dkube-controller-manager-operator-dockercfg-gkqpw&limit=500&resourceVersion=0 Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.655157 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.674538 4983 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.682309 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f53f35-efe7-4b1c-9a25-d82b824c156f-config\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.684297 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.684596 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.184561669 +0000 UTC m=+205.784660169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.685962 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.686462 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.186442785 +0000 UTC m=+205.786541255 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.695021 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.702327 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31984625-3905-4d4d-9c52-e7d11c6c15d4-proxy-tls\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.715814 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.735547 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.747440 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/22b9ac88-75ea-4572-bd27-f819caf4d8e2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.754947 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.775874 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.787395 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.787531 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.287503721 +0000 UTC m=+205.887602191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.787813 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.788144 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.28812907 +0000 UTC m=+205.888227500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.796313 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.815608 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.825574 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/de84a408-0f98-48c6-83a5-e6976b576989-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.834727 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.855882 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.867259 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-srv-cert\") pod 
\"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.876292 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.881121 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.882095 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-profile-collector-cert\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.885672 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.889739 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.890015 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.389991271 +0000 UTC m=+205.990089701 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.890253 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.890610 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.390601419 +0000 UTC m=+205.990699849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.894452 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.898725 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/74d1b439-9506-4a1a-a1a4-3f5ca7944750-images\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.915421 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.922470 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/74d1b439-9506-4a1a-a1a4-3f5ca7944750-proxy-tls\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.934238 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.955618 4983 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.975009 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.993189 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:56 crc kubenswrapper[4983]: E0316 00:09:56.994228 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.494213251 +0000 UTC m=+206.094311681 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:56 crc kubenswrapper[4983]: I0316 00:09:56.995575 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.014944 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.024933 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-key\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.035566 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.043499 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-signing-cabundle\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.056329 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.075717 4983 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.096249 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.096932 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.596909837 +0000 UTC m=+206.197008307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.114732 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.134857 4983 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.154969 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 
00:09:57.175404 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.185281 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/61000119-35ce-40ee-a8c5-5ad9052b539d-srv-cert\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.194964 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.197530 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.197733 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.697704445 +0000 UTC m=+206.297802885 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.198417 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.198921 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.698910391 +0000 UTC m=+206.299008831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.203585 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-node-bootstrap-token\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.215005 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.224946 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55da5246-1df8-4666-ad7c-9407719b3abb-certs\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.235307 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.254622 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.266598 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-apiservice-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.268407 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-webhook-cert\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.274413 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.281183 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b227bf69-003e-4831-8ce3-a5b1f7f85c31-serving-cert\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.294466 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.299567 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.300366 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.800289897 +0000 UTC m=+206.400388367 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.300620 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.301245 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.801227535 +0000 UTC m=+206.401325975 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.314401 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.333954 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.341046 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b227bf69-003e-4831-8ce3-a5b1f7f85c31-config\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.355001 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.374968 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.394231 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.397113 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.401845 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.401975 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.901956851 +0000 UTC m=+206.502055281 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.402622 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.402979 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.902971352 +0000 UTC m=+206.503069782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.421720 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.445545 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.450600 4983 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.450724 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert podName:8ac1b1cc-8499-493f-a8d9-801eb433163f nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.950695256 +0000 UTC m=+206.550793726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert") pod "route-controller-manager-6576b87f9c-rvjb2" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.451747 4983 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.451922 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert podName:249f0516-0237-4ba3-92eb-a7aa3b9c62c1 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.951885552 +0000 UTC m=+206.551984032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert") pod "apiserver-76f77b778f-lc9bv" (UID: "249f0516-0237-4ba3-92eb-a7aa3b9c62c1") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.455054 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.455921 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.458614 4983 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.458701 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config podName:249f0516-0237-4ba3-92eb-a7aa3b9c62c1 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.958682104 +0000 UTC m=+206.558780554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config") pod "apiserver-76f77b778f-lc9bv" (UID: "249f0516-0237-4ba3-92eb-a7aa3b9c62c1") : failed to sync configmap cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.463498 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.473381 4983 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.473501 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client podName:249f0516-0237-4ba3-92eb-a7aa3b9c62c1 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:57.973467936 +0000 UTC m=+206.573566576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client") pod "apiserver-76f77b778f-lc9bv" (UID: "249f0516-0237-4ba3-92eb-a7aa3b9c62c1") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.475150 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.495864 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.505623 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.505865 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.005832542 +0000 UTC m=+206.605931012 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.506243 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.506721 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.006704608 +0000 UTC m=+206.606803078 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.514556 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.523492 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/211c2269-7173-4fcb-9403-be48b10ab364-config-volume\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.537325 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.554994 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.565501 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/211c2269-7173-4fcb-9403-be48b10ab364-metrics-tls\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.575289 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.575725 4983 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.575922 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert podName:9c413c46-e4ff-43f2-b66a-8a62e1f08890 nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.075898133 +0000 UTC m=+206.675996593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert") pod "ingress-canary-5zxcb" (UID: "9c413c46-e4ff-43f2-b66a-8a62e1f08890") : failed to sync secret cache: timed out waiting for the condition
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.595287 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.607276 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.607552 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.107503216 +0000 UTC m=+206.707601656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.608516 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.609728 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.10963691 +0000 UTC m=+206.709735490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.616305 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.637488 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.673146 4983 request.go:700] Waited for 1.931990934s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.683913 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"oauth-openshift-558db77b4-df6gg\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.696432 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n25x\" (UniqueName: \"kubernetes.io/projected/1c99aa5b-9126-4ff9-9931-c7d73b51a6dc-kube-api-access-6n25x\") pod \"authentication-operator-69f744f599-v9gcl\" (UID: \"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.710342 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.710575 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.210541222 +0000 UTC m=+206.810639672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.710787 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.711179 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.21115631 +0000 UTC m=+206.811254780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.715199 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.742146 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dwvz\" (UniqueName: \"kubernetes.io/projected/288bbae2-d98f-4e70-8f83-314c8a7a038b-kube-api-access-9dwvz\") pod \"cluster-image-registry-operator-dc59b4c8b-8xj2p\" (UID: \"288bbae2-d98f-4e70-8f83-314c8a7a038b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.748616 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.754822 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.758902 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44hs\" (UniqueName: \"kubernetes.io/projected/ddcf712a-d77b-446c-b9e8-7083ff491d3c-kube-api-access-l44hs\") pod \"console-operator-58897d9998-lx4mf\" (UID: \"ddcf712a-d77b-446c-b9e8-7083ff491d3c\") " pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.761821 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.799563 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.804603 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-lx4mf"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.812630 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5kgk\" (UniqueName: \"kubernetes.io/projected/211771ed-66f1-4866-b193-5da61bbd38b4-kube-api-access-l5kgk\") pod \"downloads-7954f5f757-6j9qt\" (UID: \"211771ed-66f1-4866-b193-5da61bbd38b4\") " pod="openshift-console/downloads-7954f5f757-6j9qt"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.812793 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.812965 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.312940638 +0000 UTC m=+206.913039098 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.813459 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.813849 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.313832405 +0000 UTC m=+206.913930845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.833063 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfpdm\" (UniqueName: \"kubernetes.io/projected/d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7-kube-api-access-jfpdm\") pod \"openshift-config-operator-7777fb866f-np9wn\" (UID: \"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.866464 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.876707 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.878936 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-6j9qt"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.882125 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.929920 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:09:57 crc kubenswrapper[4983]: E0316 00:09:57.936100 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.436074623 +0000 UTC m=+207.036173063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.936623 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"controller-manager-879f6c89f-9hbqr\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.938552 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs6fr\" (UniqueName: \"kubernetes.io/projected/bcce228b-5abb-4cbb-8f79-57326a3a9665-kube-api-access-fs6fr\") pod \"openshift-apiserver-operator-796bbdcf4f-bjbs9\" (UID: \"bcce228b-5abb-4cbb-8f79-57326a3a9665\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.952850 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8lpz\" (UniqueName: \"kubernetes.io/projected/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-kube-api-access-g8lpz\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.953548 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwpkm\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-kube-api-access-gwpkm\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.970205 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:57 crc kubenswrapper[4983]: I0316 00:09:57.995371 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29gpp\" (UniqueName: \"kubernetes.io/projected/9c737bbb-9153-4689-bbd7-1925cd53b343-kube-api-access-29gpp\") pod \"apiserver-7bbb656c7d-56ljn\" (UID: \"9c737bbb-9153-4689-bbd7-1925cd53b343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.018794 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldpp\" (UniqueName: \"kubernetes.io/projected/3b3cc32d-4d8c-47ee-bf9c-2319482ab78f-kube-api-access-6ldpp\") pod \"dns-operator-744455d44c-9k8tn\" (UID: \"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f\") " pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.036738 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.036882 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.036954 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.037050 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.037136 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.037511 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.53749283 +0000 UTC m=+207.137591260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.039455 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"image-pruner-29560320-9tclx\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " pod="openshift-image-registry/image-pruner-29560320-9tclx"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.054303 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqtq5\" (UniqueName: \"kubernetes.io/projected/d76474c2-7d5c-45a0-8869-d829b0c594d6-kube-api-access-kqtq5\") pod \"console-f9d7485db-fp4l5\" (UID: \"d76474c2-7d5c-45a0-8869-d829b0c594d6\") " pod="openshift-console/console-f9d7485db-fp4l5"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.078214 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzzmt\" (UniqueName: \"kubernetes.io/projected/2688c073-5209-4258-a681-186370d9abcc-kube-api-access-bzzmt\") pod \"machine-approver-56656f9798-l59k2\" (UID: \"2688c073-5209-4258-a681-186370d9abcc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.082052 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"
Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.086029 4983 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.094092 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ebd00ffd-95e2-47bf-a6fd-663526b2283d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vbdjp\" (UID: \"ebd00ffd-95e2-47bf-a6fd-663526b2283d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.095655 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.095839 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.099708 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-config\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.112213 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.119883 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9gcl"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.123427 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.126379 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.131973 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.133293 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptgzs\" (UniqueName: \"kubernetes.io/projected/5373e962-abd6-4153-9cc9-7d17b9ae5fe5-kube-api-access-ptgzs\") pod \"migrator-59844c95c7-wp86n\" (UID: \"5373e962-abd6-4153-9cc9-7d17b9ae5fe5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.138446 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.138640 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.142947 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.642916847 +0000 UTC m=+207.243015277 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.145981 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9c413c46-e4ff-43f2-b66a-8a62e1f08890-cert\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.150690 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p57k\" (UniqueName: \"kubernetes.io/projected/3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1-kube-api-access-7p57k\") pod \"service-ca-9c57cc56f-t65x6\" (UID: \"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.159727 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.187179 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-lx4mf"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.191099 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.205857 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlk8h\" (UniqueName: 
\"kubernetes.io/projected/8662dd30-6a4c-4a3d-a3bb-8d24821241fa-kube-api-access-hlk8h\") pod \"packageserver-d55dfcdfc-mw6rk\" (UID: \"8662dd30-6a4c-4a3d-a3bb-8d24821241fa\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.217316 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz6gs\" (UniqueName: \"kubernetes.io/projected/53e96ad5-bed1-4cf2-acf0-7f61294d16a7-kube-api-access-jz6gs\") pod \"kube-storage-version-migrator-operator-b67b599dd-76l8x\" (UID: \"53e96ad5-bed1-4cf2-acf0-7f61294d16a7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.224436 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-6j9qt"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.229745 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb9f9\" (UniqueName: \"kubernetes.io/projected/211c2269-7173-4fcb-9403-be48b10ab364-kube-api-access-zb9f9\") pod \"dns-default-mjkh8\" (UID: \"211c2269-7173-4fcb-9403-be48b10ab364\") " pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.239603 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.239890 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:58.739879781 +0000 UTC m=+207.339978211 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.252340 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.257650 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stng\" (UniqueName: \"kubernetes.io/projected/74d1b439-9506-4a1a-a1a4-3f5ca7944750-kube-api-access-8stng\") pod \"machine-config-operator-74547568cd-njztx\" (UID: \"74d1b439-9506-4a1a-a1a4-3f5ca7944750\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.267888 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod211771ed_66f1_4866_b193_5da61bbd38b4.slice/crio-4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64 WatchSource:0}: Error finding container 4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64: Status 404 returned error can't find the container with id 4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.274873 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b6f53f35-efe7-4b1c-9a25-d82b824c156f-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vzrlf\" (UID: \"b6f53f35-efe7-4b1c-9a25-d82b824c156f\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.275140 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.284903 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.288336 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-np9wn"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.292477 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqlkf\" (UniqueName: \"kubernetes.io/projected/aef72d9a-3e65-495f-8e73-ee539c10a29e-kube-api-access-pqlkf\") pod \"openshift-controller-manager-operator-756b6f6bc6-zh8f9\" (UID: \"aef72d9a-3e65-495f-8e73-ee539c10a29e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.313240 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e46a85-c462-4ef3-a944-6ed47d2b0598-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-4vcp2\" (UID: \"94e46a85-c462-4ef3-a944-6ed47d2b0598\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.330862 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.333195 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwxb\" (UniqueName: \"kubernetes.io/projected/34398886-1821-47c0-bbff-951177287627-kube-api-access-9jwxb\") pod \"etcd-operator-b45778765-qzvb8\" (UID: \"34398886-1821-47c0-bbff-951177287627\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.340487 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.341095 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.84108119 +0000 UTC m=+207.441179610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.341116 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.342013 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.349480 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssz5w\" (UniqueName: \"kubernetes.io/projected/31984625-3905-4d4d-9c52-e7d11c6c15d4-kube-api-access-ssz5w\") pod \"machine-config-controller-84d6567774-6ppdt\" (UID: \"31984625-3905-4d4d-9c52-e7d11c6c15d4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.358600 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.371747 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94hh\" (UniqueName: \"kubernetes.io/projected/33ff8ce9-2d36-4251-ae9d-802d9965bfde-kube-api-access-w94hh\") pod \"cluster-samples-operator-665b6dd947-xdst8\" (UID: \"33ff8ce9-2d36-4251-ae9d-802d9965bfde\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.377266 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fp4l5"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.392398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qnw\" (UniqueName: \"kubernetes.io/projected/de84a408-0f98-48c6-83a5-e6976b576989-kube-api-access-q6qnw\") pod \"package-server-manager-789f6589d5-hqnds\" (UID: \"de84a408-0f98-48c6-83a5-e6976b576989\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.393267 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a768f3_aa53_481d_b179_5c8807f69e89.slice/crio-b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb WatchSource:0}: Error finding container b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb: Status 404 returned error can't find the container with id b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.397383 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd76474c2_7d5c_45a0_8869_d829b0c594d6.slice/crio-b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5 WatchSource:0}: 
Error finding container b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5: Status 404 returned error can't find the container with id b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.399389 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.418026 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"marketplace-operator-79b997595-tj49l\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.429236 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.433345 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"collect-profiles-29560320-mtrv4\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.438291 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.444873 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.445222 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:58.945210318 +0000 UTC m=+207.545308748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.451299 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnmbs\" (UniqueName: \"kubernetes.io/projected/61000119-35ce-40ee-a8c5-5ad9052b539d-kube-api-access-fnmbs\") pod \"olm-operator-6b444d44fb-mhd52\" (UID: \"61000119-35ce-40ee-a8c5-5ad9052b539d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.471987 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnznm\" (UniqueName: 
\"kubernetes.io/projected/8820c8ae-e5d3-4c91-8724-ec666e783179-kube-api-access-tnznm\") pod \"csi-hostpathplugin-n22z7\" (UID: \"8820c8ae-e5d3-4c91-8724-ec666e783179\") " pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.472247 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.485819 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.489230 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bspth\" (UniqueName: \"kubernetes.io/projected/b227bf69-003e-4831-8ce3-a5b1f7f85c31-kube-api-access-bspth\") pod \"service-ca-operator-777779d784-m5q8d\" (UID: \"b227bf69-003e-4831-8ce3-a5b1f7f85c31\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.491270 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.521033 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29560320-9tclx"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.524990 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"auto-csr-approver-29560328-sngnj\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") " pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.530489 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.536895 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-656xd\" (UniqueName: \"kubernetes.io/projected/9c413c46-e4ff-43f2-b66a-8a62e1f08890-kube-api-access-656xd\") pod \"ingress-canary-5zxcb\" (UID: \"9c413c46-e4ff-43f2-b66a-8a62e1f08890\") " pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.546383 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.546580 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:59.046554563 +0000 UTC m=+207.646653003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.546841 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.547216 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.047205003 +0000 UTC m=+207.647303483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.549953 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnnxd\" (UniqueName: \"kubernetes.io/projected/22b9ac88-75ea-4572-bd27-f819caf4d8e2-kube-api-access-rnnxd\") pod \"multus-admission-controller-857f4d67dd-ml6pw\" (UID: \"22b9ac88-75ea-4572-bd27-f819caf4d8e2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.563014 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.568099 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.570338 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9zp5\" (UniqueName: \"kubernetes.io/projected/dfea0242-abc1-4912-a193-6c4dc75d9bb5-kube-api-access-q9zp5\") pod \"router-default-5444994796-w8qpq\" (UID: \"dfea0242-abc1-4912-a193-6c4dc75d9bb5\") " pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.596697 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.598348 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72hqz\" (UniqueName: \"kubernetes.io/projected/f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d-kube-api-access-72hqz\") pod \"catalog-operator-68c6474976-ql6v2\" (UID: \"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.613400 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.616094 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.617742 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cp78\" (UniqueName: \"kubernetes.io/projected/55da5246-1df8-4666-ad7c-9407719b3abb-kube-api-access-9cp78\") pod \"machine-config-server-82r5r\" (UID: \"55da5246-1df8-4666-ad7c-9407719b3abb\") " pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.627563 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.629997 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-9k8tn"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.631244 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/143ccd96-ced1-466f-8891-72abc221bbac-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xnx49\" (UID: \"143ccd96-ced1-466f-8891-72abc221bbac\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.638417 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.645742 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.646173 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.651567 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.651982 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:09:59.151966539 +0000 UTC m=+207.752064969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.654530 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.654581 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.656698 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-etcd-client\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.662062 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.662274 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.665889 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.673844 4983 request.go:700] Waited for 1.840483423s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.675307 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.679074 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczdg\" (UniqueName: \"kubernetes.io/projected/6b0e4e23-a158-4597-b005-db088a652ec8-kube-api-access-zczdg\") pod \"machine-api-operator-5694c8668f-t4lj8\" (UID: \"6b0e4e23-a158-4597-b005-db088a652ec8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.684255 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2zdq\" (UniqueName: \"kubernetes.io/projected/bb33b891-4cdb-4fc1-95e4-2895f40fdb7a-kube-api-access-k2zdq\") pod \"control-plane-machine-set-operator-78cbb6b69f-5pqgr\" (UID: \"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.684299 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.691798 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-82r5r" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.696152 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.700397 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/249f0516-0237-4ba3-92eb-a7aa3b9c62c1-serving-cert\") pod \"apiserver-76f77b778f-lc9bv\" (UID: \"249f0516-0237-4ba3-92eb-a7aa3b9c62c1\") " pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.707275 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.707786 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f5bd50b_b197_4deb_ac50_768e3baa6cff.slice/crio-de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce WatchSource:0}: Error finding container de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce: Status 404 returned error can't find the container with id de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.715203 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.716229 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.721000 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t65x6"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.763093 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5zxcb" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.763996 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.766271 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.767604 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.26759008 +0000 UTC m=+207.867688510 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.784426 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.800599 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-njztx"] Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.803297 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"route-controller-manager-6576b87f9c-rvjb2\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.812653 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b3cc32d_4d8c_47ee_bf9c_2319482ab78f.slice/crio-213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08 WatchSource:0}: Error finding container 213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08: Status 404 returned error can't find the container with id 213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08 Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.817232 4983 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcce228b_5abb_4cbb_8f79_57326a3a9665.slice/crio-3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553 WatchSource:0}: Error finding container 3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553: Status 404 returned error can't find the container with id 3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.818001 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.819188 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f53f35_efe7_4b1c_9a25_d82b824c156f.slice/crio-8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36 WatchSource:0}: Error finding container 8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36: Status 404 returned error can't find the container with id 8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36 Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.825440 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e96ad5_bed1_4cf2_acf0_7f61294d16a7.slice/crio-a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440 WatchSource:0}: Error finding container a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440: Status 404 returned error can't find the container with id a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440 Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.837673 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.841709 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.844388 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dee21fa_f6c7_4ef6_a99d_21ad42acd3e1.slice/crio-d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676 WatchSource:0}: Error finding container d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676: Status 404 returned error can't find the container with id d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676 Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.848411 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8662dd30_6a4c_4a3d_a3bb_8d24821241fa.slice/crio-267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a WatchSource:0}: Error finding container 267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a: Status 404 returned error can't find the container with id 267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.851820 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.855220 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d1b439_9506_4a1a_a1a4_3f5ca7944750.slice/crio-7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b WatchSource:0}: Error finding container 7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b: Status 404 returned error can't find the container with id 7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b Mar 16 00:09:58 crc kubenswrapper[4983]: W0316 00:09:58.867017 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2688c073_5209_4258_a681_186370d9abcc.slice/crio-0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e WatchSource:0}: Error finding container 0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e: Status 404 returned error can't find the container with id 0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.868165 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.868509 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.368494782 +0000 UTC m=+207.968593212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.868582 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:09:58 crc kubenswrapper[4983]: I0316 00:09:58.973549 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:58 crc kubenswrapper[4983]: E0316 00:09:58.973891 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.473877018 +0000 UTC m=+208.073975448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.075200 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.075656 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.575636565 +0000 UTC m=+208.175735005 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.081735 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.083983 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mjkh8"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.085339 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerStarted","Data":"b0a22b729384eceaec70338f2664d98b4fd349a61381f59e8d8000d93e6ea45c"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.085377 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerStarted","Data":"868596c93911bc2ce0d03c99cb54a64d4a75b2790c09c5e6365f25e02d16c389"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.088691 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" event={"ID":"8662dd30-6a4c-4a3d-a3bb-8d24821241fa","Type":"ContainerStarted","Data":"267e1c669f5b6be78d2c5d428fe7e487b5235740a5be9f3a8c5313d3ed3e5b8a"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.103941 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-lx4mf" event={"ID":"ddcf712a-d77b-446c-b9e8-7083ff491d3c","Type":"ContainerStarted","Data":"d2688ca4b6c5c707a80fb06943ee2c21b4f8c5dd00db6134d8d4f77d5c364e05"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.103973 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" event={"ID":"ddcf712a-d77b-446c-b9e8-7083ff491d3c","Type":"ContainerStarted","Data":"d0961189be900e1c2dcda31be599fd30f44777d7f3a5703f8d03618f8973bb05"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.103989 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.117134 4983 patch_prober.go:28] interesting pod/console-operator-58897d9998-lx4mf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.117176 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" podUID="ddcf712a-d77b-446c-b9e8-7083ff491d3c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.122588 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" event={"ID":"bcce228b-5abb-4cbb-8f79-57326a3a9665","Type":"ContainerStarted","Data":"3ec71c3d2428b1483823a956e0140519b9b4d56c490cd7d07c693e2bdd3ce553"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.128138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" event={"ID":"ebd00ffd-95e2-47bf-a6fd-663526b2283d","Type":"ContainerStarted","Data":"5162cab02d86fc9683e9aa9db705e7180d844cda571df9dbd03bb411eb3f2b8c"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.129289 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" event={"ID":"9c737bbb-9153-4689-bbd7-1925cd53b343","Type":"ContainerStarted","Data":"3a36dd48ac206ec21f4409f01f12556f1ffcb5a523b2d58864e8c8059ab57fb6"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.130516 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerStarted","Data":"5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.130552 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerStarted","Data":"992aee5b0776d510c59718dbe65f51126e10a5ddde1021826a4cd33845179277"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.130921 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.131298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" event={"ID":"2688c073-5209-4258-a681-186370d9abcc","Type":"ContainerStarted","Data":"0514edf5057782dc147fcbf25c2e071a12af4f2b549922f7163f1f3bc0edaa4e"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.132421 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" 
event={"ID":"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc","Type":"ContainerStarted","Data":"cebb292829ee5a803948ac9659b48660b8fe5edfe2747823cec43da8907f3802"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.132444 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" event={"ID":"1c99aa5b-9126-4ff9-9931-c7d73b51a6dc","Type":"ContainerStarted","Data":"7492fef3c4c1f293e07fe42f7d3a7b16a15efd858458425883b9d912d63b20b8"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.134851 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" event={"ID":"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1","Type":"ContainerStarted","Data":"d65927f74531b4ae7237f192b19f24d9b771cf72d783cc16be3b30d05db2b676"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.135631 4983 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-df6gg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.135686 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.143948 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fp4l5" event={"ID":"d76474c2-7d5c-45a0-8869-d829b0c594d6","Type":"ContainerStarted","Data":"b3f67d7e32c58857716cb5db5b5ff53b1420e134f3f9b090f505ee73349d30f5"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.146072 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" event={"ID":"53e96ad5-bed1-4cf2-acf0-7f61294d16a7","Type":"ContainerStarted","Data":"a03594a8fe18f1d17dcd038ab4c4d231f097f012bb6b478888b19411d2d3e440"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.146663 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" event={"ID":"b6f53f35-efe7-4b1c-9a25-d82b824c156f","Type":"ContainerStarted","Data":"8731e4e180ee2daf698749d84e99c4ed8cdffe0f0fdfed703a8d018531ed2b36"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.150634 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" event={"ID":"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f","Type":"ContainerStarted","Data":"213fa260d6d2ab1c7363915f32ad9c9d84d0c738700cc118181e99da7f885c08"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.154108 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerStarted","Data":"de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.155346 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" event={"ID":"74d1b439-9506-4a1a-a1a4-3f5ca7944750","Type":"ContainerStarted","Data":"7773e05a45c023dfbaee250133ba3c8509f1eb88ed482e7416f7b66f9be6f92b"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.157380 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6j9qt" event={"ID":"211771ed-66f1-4866-b193-5da61bbd38b4","Type":"ContainerStarted","Data":"5c8492fec88f4b13618a9dbd6bd6da1904f6e125f5f134be0db02f38c23179ca"} Mar 16 00:09:59 crc 
kubenswrapper[4983]: I0316 00:09:59.157398 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-6j9qt" event={"ID":"211771ed-66f1-4866-b193-5da61bbd38b4","Type":"ContainerStarted","Data":"4fbf47dbb17c4a5210d7461c6dede8a14a1f876ce7c9d7841cd12a9740c1ad64"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.157986 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.159393 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" event={"ID":"288bbae2-d98f-4e70-8f83-314c8a7a038b","Type":"ContainerStarted","Data":"73989f1f0cc2f3779005a964d470c9e62f00da4ba85bed4bbf78f1448df2d8aa"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.159412 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" event={"ID":"288bbae2-d98f-4e70-8f83-314c8a7a038b","Type":"ContainerStarted","Data":"90b48e07a8682d9e2d89981f9b4eca33656e2c6cd4e2fbb06d5f3e86a1ff6df3"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.162027 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.162051 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.162211 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" event={"ID":"5373e962-abd6-4153-9cc9-7d17b9ae5fe5","Type":"ContainerStarted","Data":"e67fd58081418d2850b7c6ee984c0827591c97f3998cc0c096962e658294485a"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.164172 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerStarted","Data":"b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb"} Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.164887 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.171540 4983 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9hbqr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.171592 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.176422 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc 
kubenswrapper[4983]: E0316 00:09:59.177917 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.677904367 +0000 UTC m=+208.278002797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.242670 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.277348 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.278156 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.77783689 +0000 UTC m=+208.377935320 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.278348 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.279228 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.779212711 +0000 UTC m=+208.379311231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.348542 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.381478 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.382021 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.882003169 +0000 UTC m=+208.482101599 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.382045 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.420817 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.433499 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qzvb8"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.437297 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.482179 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" podStartSLOduration=160.482162868 podStartE2EDuration="2m40.482162868s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.447891636 +0000 UTC m=+208.047990056" watchObservedRunningTime="2026-03-16 00:09:59.482162868 +0000 UTC m=+208.082261298" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.482605 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.482874 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:09:59.982862799 +0000 UTC m=+208.582961229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.537366 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podStartSLOduration=161.537348546 podStartE2EDuration="2m41.537348546s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.535796989 +0000 UTC m=+208.135895429" watchObservedRunningTime="2026-03-16 00:09:59.537348546 +0000 UTC m=+208.137446976" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.588312 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.588616 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.088601825 +0000 UTC m=+208.688700245 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: W0316 00:09:59.627056 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e46a85_c462_4ef3_a944_6ed47d2b0598.slice/crio-a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae WatchSource:0}: Error finding container a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae: Status 404 returned error can't find the container with id a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.645126 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xj2p" podStartSLOduration=160.645104122 podStartE2EDuration="2m40.645104122s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.607956303 +0000 UTC m=+208.208054743" watchObservedRunningTime="2026-03-16 00:09:59.645104122 +0000 UTC m=+208.245202552" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.689404 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.689678 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.189667982 +0000 UTC m=+208.789766412 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.730911 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podStartSLOduration=160.730892632 podStartE2EDuration="2m40.730892632s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:09:59.725265184 +0000 UTC m=+208.325363614" watchObservedRunningTime="2026-03-16 00:09:59.730892632 +0000 UTC m=+208.330991062" Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.790910 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.791245 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.291219463 +0000 UTC m=+208.891317893 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.843068 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d"] Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.843118 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49"] Mar 16 00:09:59 crc kubenswrapper[4983]: W0316 00:09:59.890549 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod143ccd96_ced1_466f_8891_72abc221bbac.slice/crio-dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8 WatchSource:0}: Error finding container dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8: Status 404 returned error can't find the container with id dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8 Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.892219 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.892493 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.392483195 +0000 UTC m=+208.992581625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:09:59 crc kubenswrapper[4983]: I0316 00:09:59.993323 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:09:59 crc kubenswrapper[4983]: E0316 00:09:59.996640 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.496591123 +0000 UTC m=+209.096689553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.045692 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-6j9qt" podStartSLOduration=161.045673888 podStartE2EDuration="2m41.045673888s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.044019148 +0000 UTC m=+208.644117578" watchObservedRunningTime="2026-03-16 00:10:00.045673888 +0000 UTC m=+208.645772318" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.100715 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.101362 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.601350209 +0000 UTC m=+209.201448639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.141136 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.141188 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lc9bv"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.141203 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5zxcb"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.164203 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-t4lj8"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.181771 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.194956 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.194982 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" event={"ID":"53e96ad5-bed1-4cf2-acf0-7f61294d16a7","Type":"ContainerStarted","Data":"353e6260b7eaf4cf83f83e7aff56c6a55ad8ab23277aed21c0f04df9c5a57ad2"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.195100 4983 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" event={"ID":"b227bf69-003e-4831-8ce3-a5b1f7f85c31","Type":"ContainerStarted","Data":"8597cc81eadbdec63ceeb97b6aee7d39085f76d02923b96d9f85bf9188c36341"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.195115 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" event={"ID":"143ccd96-ced1-466f-8891-72abc221bbac","Type":"ContainerStarted","Data":"dbdeebfa567437983439c92b15646caa142fa64eb93fccd9e33b193485d6a2a8"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.195182 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.196056 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.196635 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" event={"ID":"5373e962-abd6-4153-9cc9-7d17b9ae5fe5","Type":"ContainerStarted","Data":"2db08a5909d98bd7ece8fd48bbbc264af1806dddff0a85c34295e46ad5d3bce6"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.202546 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.202681 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.702655413 +0000 UTC m=+209.302753843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.203043 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.203299 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.703290232 +0000 UTC m=+209.303388662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.204628 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.205567 4983 generic.go:334] "Generic (PLEG): container finished" podID="d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7" containerID="b0a22b729384eceaec70338f2664d98b4fd349a61381f59e8d8000d93e6ea45c" exitCode=0 Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.205659 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerDied","Data":"b0a22b729384eceaec70338f2664d98b4fd349a61381f59e8d8000d93e6ea45c"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.207950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" event={"ID":"31984625-3905-4d4d-9c52-e7d11c6c15d4","Type":"ContainerStarted","Data":"f4d420d9b34a817e92b62516fb78a06020b7ffa150dd9aef10049cb514baa18d"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.210694 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ml6pw"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.212878 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-82r5r" 
event={"ID":"55da5246-1df8-4666-ad7c-9407719b3abb","Type":"ContainerStarted","Data":"3bd5255589a4c515205e67bdf4f87eb0afab47190517d237d03c01ce52969d12"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.214202 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mjkh8" event={"ID":"211c2269-7173-4fcb-9403-be48b10ab364","Type":"ContainerStarted","Data":"6a33289fd77521cd37166fdf24cf77edbd7c3c31b54c359730b472bee23e39df"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.226279 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" event={"ID":"34398886-1821-47c0-bbff-951177287627","Type":"ContainerStarted","Data":"145e9218333e55acb9840b6fb949df13106ef40679a8fa644b1bc3725c1f8433"} Mar 16 00:10:00 crc kubenswrapper[4983]: W0316 00:10:00.228604 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249f0516_0237_4ba3_92eb_a7aa3b9c62c1.slice/crio-568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2 WatchSource:0}: Error finding container 568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2: Status 404 returned error can't find the container with id 568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2 Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.234496 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.235691 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w8qpq" event={"ID":"dfea0242-abc1-4912-a193-6c4dc75d9bb5","Type":"ContainerStarted","Data":"24d272d04c0560549b6742b323e69649413ed948f5548e408b9b1c261d2d388e"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.238503 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-fp4l5" event={"ID":"d76474c2-7d5c-45a0-8869-d829b0c594d6","Type":"ContainerStarted","Data":"f1aed9c65d30ac039cadab304947e98f58048ca416d66ebdb897b989f91be90d"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.245979 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-n22z7"] Mar 16 00:10:00 crc kubenswrapper[4983]: W0316 00:10:00.248460 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ac1b1cc_8499_493f_a8d9_801eb433163f.slice/crio-8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0 WatchSource:0}: Error finding container 8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0: Status 404 returned error can't find the container with id 8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0 Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.248720 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" event={"ID":"6b0e4e23-a158-4597-b005-db088a652ec8","Type":"ContainerStarted","Data":"ed21a48833c174e4fcc13e352d4b1b175887dbf647f6297028b62e850ef69e92"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.253662 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerStarted","Data":"b9e245e332a00fe31e8a513f16d938a911b68f20bd84b7aa4a069280729c1f31"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.256098 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" event={"ID":"74d1b439-9506-4a1a-a1a4-3f5ca7944750","Type":"ContainerStarted","Data":"97b77baf4f8726300cf864bec54760df49ff9c95a4afffdfa6ee99e92da6566d"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.256782 4983 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerStarted","Data":"1965cf54da33760615e034ca9db488c5481e59caf0aa16831ccaefaf972dbc39"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.259602 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" event={"ID":"94e46a85-c462-4ef3-a944-6ed47d2b0598","Type":"ContainerStarted","Data":"a857e30f706d81c0ba7ca15373135d1e1ab1827dd2b9bc97b0f5352aef4c79ae"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.269816 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.272156 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" event={"ID":"aef72d9a-3e65-495f-8e73-ee539c10a29e","Type":"ContainerStarted","Data":"b5315a03863713f843c9b944f4b1c0c565229cc0e308c490f17a5db9d59c1391"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.273053 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52"] Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.285660 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" event={"ID":"b6f53f35-efe7-4b1c-9a25-d82b824c156f","Type":"ContainerStarted","Data":"493e177d5c682b97ceed9fadde908b79b08dde1bb0fbaa3eb23b0c4b5f72a635"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.293389 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" 
event={"ID":"ebd00ffd-95e2-47bf-a6fd-663526b2283d","Type":"ContainerStarted","Data":"e8c35420a3bab1fb4cd8c47471ef668f1a6989227fd8ee85480c5e541dd8e2ec"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.299473 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" event={"ID":"8662dd30-6a4c-4a3d-a3bb-8d24821241fa","Type":"ContainerStarted","Data":"5333ca9cc1a5370bfe2d231afed36f38b2926a11831f08aa215141d60e61c169"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.301121 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.303534 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.303749 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"auto-csr-approver-29560330-65dr5\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") " pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.303911 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.803847363 +0000 UTC m=+209.403945803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.304020 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.305676 4983 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-mw6rk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.305714 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" podUID="8662dd30-6a4c-4a3d-a3bb-8d24821241fa" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.306138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" 
event={"ID":"bcce228b-5abb-4cbb-8f79-57326a3a9665","Type":"ContainerStarted","Data":"2db792fdc99be701472444abc112040e2e138288e6176d6d440b992e7c9d893a"} Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.306322 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.806291366 +0000 UTC m=+209.406389836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.313470 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" event={"ID":"3dee21fa-f6c7-4ef6-a99d-21ad42acd3e1","Type":"ContainerStarted","Data":"ba869e3e342ba09bef033d10bd950525d19fc3c166e1837dfa0dd87b1cca26b3"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.323502 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.332468 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerStarted","Data":"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.336898 4983 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9hbqr container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.337137 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.338865 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" event={"ID":"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d","Type":"ContainerStarted","Data":"12cab49d37456bf346f64873c96c0b6f78f6ed32a45865a4e4e1d7e1b4d68a36"} Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.339710 4983 patch_prober.go:28] interesting pod/console-operator-58897d9998-lx4mf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.339765 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" podUID="ddcf712a-d77b-446c-b9e8-7083ff491d3c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.341221 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" 
start-of-body= Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.341266 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.347990 4983 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-df6gg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.348033 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.405333 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.405724 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:00.905697063 +0000 UTC m=+209.505795493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.405787 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"auto-csr-approver-29560330-65dr5\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") " pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.450037 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"auto-csr-approver-29560330-65dr5\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") " pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.507654 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.509641 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:01.009607125 +0000 UTC m=+209.609705545 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.515613 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9gcl" podStartSLOduration=162.515590743 podStartE2EDuration="2m42.515590743s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.465963532 +0000 UTC m=+209.066061962" watchObservedRunningTime="2026-03-16 00:10:00.515590743 +0000 UTC m=+209.115689183" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.553200 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.609526 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.609654 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.10962902 +0000 UTC m=+209.709727450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.609913 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.610206 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.110199327 +0000 UTC m=+209.710297757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.663063 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t65x6" podStartSLOduration=161.663047134 podStartE2EDuration="2m41.663047134s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.661452107 +0000 UTC m=+209.261550537" watchObservedRunningTime="2026-03-16 00:10:00.663047134 +0000 UTC m=+209.263145564" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.710735 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.710892 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.210864862 +0000 UTC m=+209.810963292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.711042 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.711340 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.211327845 +0000 UTC m=+209.811426275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.747018 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bjbs9" podStartSLOduration=162.74699948 podStartE2EDuration="2m42.74699948s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.746523666 +0000 UTC m=+209.346622096" watchObservedRunningTime="2026-03-16 00:10:00.74699948 +0000 UTC m=+209.347097910" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.812372 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.812971 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.312956549 +0000 UTC m=+209.913054979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.830105 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29560320-9tclx" podStartSLOduration=162.83008508 podStartE2EDuration="2m42.83008508s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.828970367 +0000 UTC m=+209.429068797" watchObservedRunningTime="2026-03-16 00:10:00.83008508 +0000 UTC m=+209.430183510" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.869500 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-fp4l5" podStartSLOduration=161.869485716 podStartE2EDuration="2m41.869485716s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.865333742 +0000 UTC m=+209.465432172" watchObservedRunningTime="2026-03-16 00:10:00.869485716 +0000 UTC m=+209.469584146" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.915107 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:00 crc kubenswrapper[4983]: E0316 00:10:00.915488 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.415477259 +0000 UTC m=+210.015575689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.958304 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-76l8x" podStartSLOduration=161.958288337 podStartE2EDuration="2m41.958288337s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:00.908513351 +0000 UTC m=+209.508611781" watchObservedRunningTime="2026-03-16 00:10:00.958288337 +0000 UTC m=+209.558386767" Mar 16 00:10:00 crc kubenswrapper[4983]: I0316 00:10:00.958610 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" podStartSLOduration=161.958605676 podStartE2EDuration="2m41.958605676s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 
00:10:00.95707157 +0000 UTC m=+209.557170000" watchObservedRunningTime="2026-03-16 00:10:00.958605676 +0000 UTC m=+209.558704106" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.016100 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.016655 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.516639068 +0000 UTC m=+210.116737488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.016854 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.016941 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.017012 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.017082 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.018909 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.518891095 +0000 UTC m=+210.118989525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.022606 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.025296 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.026253 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.083386 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"] Mar 16 00:10:01 crc kubenswrapper[4983]: W0316 00:10:01.101074 
4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc39b8480_5521_4ff7_b6ec_4f67009b1f5c.slice/crio-7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980 WatchSource:0}: Error finding container 7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980: Status 404 returned error can't find the container with id 7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980 Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.119333 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.119705 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.119892 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.119899 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:01.61987936 +0000 UTC m=+210.219977790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.124315 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6993dda4-ac10-47af-b406-d49d7781fbe5-metrics-certs\") pod \"network-metrics-daemon-qvtjp\" (UID: \"6993dda4-ac10-47af-b406-d49d7781fbe5\") " pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.129607 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.146716 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.159488 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.221053 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.221434 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.72141853 +0000 UTC m=+210.321516950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.322622 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.331741 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.831708482 +0000 UTC m=+210.431806912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.331878 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.332165 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.832155455 +0000 UTC m=+210.432253885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.348925 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" event={"ID":"22b9ac88-75ea-4572-bd27-f819caf4d8e2","Type":"ContainerStarted","Data":"db44784233b9ad9415444196abd7c2faf0178b2ea7e916574b985ac8f9ee8bdc"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.348965 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" event={"ID":"22b9ac88-75ea-4572-bd27-f819caf4d8e2","Type":"ContainerStarted","Data":"fc830bbf4e44f41be59d2028c77e60219664f9f57787783c802978e339a499c4"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.350207 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvtjp" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.359996 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" event={"ID":"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a","Type":"ContainerStarted","Data":"0afe989832c835fb78017edd78caeb4fb8bd8816c3ff94dd67dc89d5503b5838"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.377305 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" event={"ID":"b227bf69-003e-4831-8ce3-a5b1f7f85c31","Type":"ContainerStarted","Data":"d0fe8155e1db748bfd6f6a3c0969379dbdd3651f5523ef5689ebe917e0ce2b31"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.384025 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerStarted","Data":"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.384104 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerStarted","Data":"8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.385902 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-82r5r" event={"ID":"55da5246-1df8-4666-ad7c-9407719b3abb","Type":"ContainerStarted","Data":"de6be5c08ae54bf0f30e8f28268227d5ca4e161c02d894131f116068fe5fb4c7"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.400095 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" event={"ID":"2688c073-5209-4258-a681-186370d9abcc","Type":"ContainerStarted","Data":"9aa5f237308263393189ed9b77b9ee06c8bb53be68ab9fde971ad882c8563d6a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.426733 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-m5q8d" podStartSLOduration=162.426707197 podStartE2EDuration="2m42.426707197s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.418843663 +0000 UTC m=+210.018942093" watchObservedRunningTime="2026-03-16 00:10:01.426707197 +0000 UTC m=+210.026805647" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.427442 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.433087 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.433443 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.933426398 +0000 UTC m=+210.533524828 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.433501 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.434177 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:01.93417005 +0000 UTC m=+210.534268480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.472743 4983 generic.go:334] "Generic (PLEG): container finished" podID="9c737bbb-9153-4689-bbd7-1925cd53b343" containerID="62141da8c2526e97aa1822a7f3f641b9ed6f2d142101bf4f93cbc2e90f300eea" exitCode=0 Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.472811 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" event={"ID":"9c737bbb-9153-4689-bbd7-1925cd53b343","Type":"ContainerDied","Data":"62141da8c2526e97aa1822a7f3f641b9ed6f2d142101bf4f93cbc2e90f300eea"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.486122 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerStarted","Data":"568845ee7ac2c267cff9d54987aa3c87ae9f7d3363c12766495c05454443fca2"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.503639 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-82r5r" podStartSLOduration=6.503620123 podStartE2EDuration="6.503620123s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.450554129 +0000 UTC m=+210.050652579" watchObservedRunningTime="2026-03-16 00:10:01.503620123 +0000 UTC m=+210.103718553" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 
00:10:01.504381 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" event={"ID":"33ff8ce9-2d36-4251-ae9d-802d9965bfde","Type":"ContainerStarted","Data":"f43fc07ca791e8d3877807f155909f93329aa455340ee15d5dec03ee7b00c13d"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.511996 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560328-sngnj" event={"ID":"9da42bf3-da76-4db7-9653-f2f08567084f","Type":"ContainerStarted","Data":"fde617a4855b193426c3b4102e81b29ab0d3e6c44d90e708f2f6bda3bb35ebf8"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.515320 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" event={"ID":"aef72d9a-3e65-495f-8e73-ee539c10a29e","Type":"ContainerStarted","Data":"32982a122c4230e09d6194d06c3188d9b95d47f1e5ee640325ee4909c9ed26c5"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.517316 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerStarted","Data":"7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.519809 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerStarted","Data":"2f6ac418ab83db7361af1e5d0897d96c9e84cd20e3d27e7aa8176847f1f3a492"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.521599 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" 
event={"ID":"de84a408-0f98-48c6-83a5-e6976b576989","Type":"ContainerStarted","Data":"aeb14511e9d7abc3a665b28b92162b19c96ca2a289f5d44810a7c3dd5113a574"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.523774 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" event={"ID":"f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d","Type":"ContainerStarted","Data":"ff995c0679f94df40aa9c87e2609ad1cfe3da3323aee4535dcb64f151df65d57"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.524412 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.531316 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" event={"ID":"31984625-3905-4d4d-9c52-e7d11c6c15d4","Type":"ContainerStarted","Data":"7d17b520540e29ad54b4b7565c0ea7d810bd02ec7c86200a0cf535f85d956f87"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.532678 4983 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ql6v2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.532744 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" podUID="f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.535312 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.535812 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.035769003 +0000 UTC m=+210.635867453 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.535999 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.536985 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" event={"ID":"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f","Type":"ContainerStarted","Data":"dbf20a3ef1ed3e47b6b6d1f462e21aac4f1f95b0cb81df81c7adac4f00b18da0"} Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.537420 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.037407462 +0000 UTC m=+210.637505942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.553279 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zh8f9" podStartSLOduration=162.553257165 podStartE2EDuration="2m42.553257165s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.532148275 +0000 UTC m=+210.132246715" watchObservedRunningTime="2026-03-16 00:10:01.553257165 +0000 UTC m=+210.153355595" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.553616 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" podStartSLOduration=162.553608635 podStartE2EDuration="2m42.553608635s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.552947105 +0000 UTC m=+210.153045545" watchObservedRunningTime="2026-03-16 00:10:01.553608635 +0000 UTC m=+210.153707065" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.566326 
4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" event={"ID":"34398886-1821-47c0-bbff-951177287627","Type":"ContainerStarted","Data":"e9babf5e0b9fb5ed9b0059cdd6c59bc430c9fae68c880bbfb9f80d6e711e2a4a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.576372 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-w8qpq" event={"ID":"dfea0242-abc1-4912-a193-6c4dc75d9bb5","Type":"ContainerStarted","Data":"42e4540e74db8d1e15507582abac8245193e5683a94a2cf1102c1a10a2a3265a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.591546 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qzvb8" podStartSLOduration=162.591525557 podStartE2EDuration="2m42.591525557s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.583463706 +0000 UTC m=+210.183562136" watchObservedRunningTime="2026-03-16 00:10:01.591525557 +0000 UTC m=+210.191623997" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.608550 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-w8qpq" podStartSLOduration=162.608529464 podStartE2EDuration="2m42.608529464s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.60805554 +0000 UTC m=+210.208153990" watchObservedRunningTime="2026-03-16 00:10:01.608529464 +0000 UTC m=+210.208627894" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.609354 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5zxcb" 
event={"ID":"9c413c46-e4ff-43f2-b66a-8a62e1f08890","Type":"ContainerStarted","Data":"2a7d1438a2f4b768ee017c4a995fbace90b444f599c615beb69ed4a8dbbf2535"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.609398 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5zxcb" event={"ID":"9c413c46-e4ff-43f2-b66a-8a62e1f08890","Type":"ContainerStarted","Data":"19f550fedf916378df938f466a725fe092180d7e9338e70c9778f0b22704bf3a"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.620074 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" event={"ID":"ebd00ffd-95e2-47bf-a6fd-663526b2283d","Type":"ContainerStarted","Data":"720cccf4750e93f0cac636b8f690f69530885b94b4b99cfe81b8d8691bb0ac1b"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.654306 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.655804 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.155787215 +0000 UTC m=+210.755885645 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.688271 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" event={"ID":"d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7","Type":"ContainerStarted","Data":"cbad8360383812db201164e02dc84201e30bf6e665fac7ec7d9208decb400509"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.689018 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.716918 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vbdjp" podStartSLOduration=162.716904479 podStartE2EDuration="2m42.716904479s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.715288531 +0000 UTC m=+210.315386961" watchObservedRunningTime="2026-03-16 00:10:01.716904479 +0000 UTC m=+210.317002909" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.717382 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5zxcb" podStartSLOduration=6.717377743 podStartE2EDuration="6.717377743s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-16 00:10:01.654054213 +0000 UTC m=+210.254152643" watchObservedRunningTime="2026-03-16 00:10:01.717377743 +0000 UTC m=+210.317476173" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.746803 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" event={"ID":"5373e962-abd6-4153-9cc9-7d17b9ae5fe5","Type":"ContainerStarted","Data":"5fcb8f3e6330c0e897c5e39d3a96b822b82c1d6ad194d1ebe0d1b9e57e4887db"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.754901 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"e0dc80f80d5d7392c25a4334242663a0ba9cb67f7c9729fd71d5c1f6339b83bc"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.755716 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.757636 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.257623204 +0000 UTC m=+210.857721634 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.760068 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" event={"ID":"94e46a85-c462-4ef3-a944-6ed47d2b0598","Type":"ContainerStarted","Data":"f41daa53721aa931e3c04682ddc91e1dc1bad874e3d3733d9fb034b95a99f1ad"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.762938 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mjkh8" event={"ID":"211c2269-7173-4fcb-9403-be48b10ab364","Type":"ContainerStarted","Data":"210f991cfbd379418f978742f9e5ee3dc1c4f7f781e87153dab509a41a256d8c"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.776696 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" event={"ID":"61000119-35ce-40ee-a8c5-5ad9052b539d","Type":"ContainerStarted","Data":"670f5a893936b8ad96a01d0355ddc5d7d757a88f85618efe4199c69a6ead4fd8"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.782722 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" event={"ID":"74d1b439-9506-4a1a-a1a4-3f5ca7944750","Type":"ContainerStarted","Data":"be8da38062686a4afdd070377ef603c986ba63707735ff708762f4c2c2a1bd61"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.788857 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" podStartSLOduration=162.788838676 podStartE2EDuration="2m42.788838676s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.787683221 +0000 UTC m=+210.387781651" watchObservedRunningTime="2026-03-16 00:10:01.788838676 +0000 UTC m=+210.388937106" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.804170 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" event={"ID":"143ccd96-ced1-466f-8891-72abc221bbac","Type":"ContainerStarted","Data":"6aef27281bf276883a364fa083d17698ca8634526f370fd890cb269151edb872"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.807854 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" event={"ID":"6b0e4e23-a158-4597-b005-db088a652ec8","Type":"ContainerStarted","Data":"22de531062765275223bfd19cc2f919cc47730fd4df219c2ba7c9232ff4ac956"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.820135 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.820185 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.820615 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerStarted","Data":"44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36"} Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.822139 4983 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9hbqr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.822279 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832683 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832748 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832865 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.832905 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="Get 
\"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.833018 4983 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-mw6rk container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.833051 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" podUID="8662dd30-6a4c-4a3d-a3bb-8d24821241fa" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.834283 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wp86n" podStartSLOduration=162.834249931 podStartE2EDuration="2m42.834249931s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.813130501 +0000 UTC m=+210.413228931" watchObservedRunningTime="2026-03-16 00:10:01.834249931 +0000 UTC m=+210.434348361" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.836105 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tj49l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.836143 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" 
podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.845050 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-4vcp2" podStartSLOduration=162.845033853 podStartE2EDuration="2m42.845033853s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.839652943 +0000 UTC m=+210.439751373" watchObservedRunningTime="2026-03-16 00:10:01.845033853 +0000 UTC m=+210.445132283" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.856853 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.856969 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.356948599 +0000 UTC m=+210.957047029 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.857169 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.859553 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.359537486 +0000 UTC m=+210.959635916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.916980 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-njztx" podStartSLOduration=162.916963259 podStartE2EDuration="2m42.916963259s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.87944251 +0000 UTC m=+210.479540940" watchObservedRunningTime="2026-03-16 00:10:01.916963259 +0000 UTC m=+210.517061679" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.931259 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podStartSLOduration=162.931243535 podStartE2EDuration="2m42.931243535s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.914837246 +0000 UTC m=+210.514935676" watchObservedRunningTime="2026-03-16 00:10:01.931243535 +0000 UTC m=+210.531341965" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.933543 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vzrlf" podStartSLOduration=162.933537154 podStartE2EDuration="2m42.933537154s" podCreationTimestamp="2026-03-16 
00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.932270386 +0000 UTC m=+210.532368816" watchObservedRunningTime="2026-03-16 00:10:01.933537154 +0000 UTC m=+210.533635574" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.956217 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xnx49" podStartSLOduration=162.95620014 podStartE2EDuration="2m42.95620014s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:01.954970774 +0000 UTC m=+210.555069204" watchObservedRunningTime="2026-03-16 00:10:01.95620014 +0000 UTC m=+210.556298570" Mar 16 00:10:01 crc kubenswrapper[4983]: I0316 00:10:01.958265 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:01 crc kubenswrapper[4983]: E0316 00:10:01.959625 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.459611962 +0000 UTC m=+211.059710392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.058346 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qvtjp"] Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.060035 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.060357 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.560345829 +0000 UTC m=+211.160444259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.161635 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.161991 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.661949721 +0000 UTC m=+211.262048151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.162119 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.162612 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.662599081 +0000 UTC m=+211.262697511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.262669 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.262992 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.762972886 +0000 UTC m=+211.363071316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.263210 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.263710 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.763701818 +0000 UTC m=+211.363800248 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.364770 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.364936 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.864915759 +0000 UTC m=+211.465014189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.365128 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.365450 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.865439355 +0000 UTC m=+211.465537785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.466998 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.467154 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.96712796 +0000 UTC m=+211.567226400 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.467220 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.467571 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:02.967561973 +0000 UTC m=+211.567660403 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.568701 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.568902 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.068876517 +0000 UTC m=+211.668974947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.569243 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.569526 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.069513276 +0000 UTC m=+211.669611706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.670002 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.670357 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.170342425 +0000 UTC m=+211.770440855 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.770963 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.771307 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.271289468 +0000 UTC m=+211.871387918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.820935 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.820982 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.868976 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerStarted","Data":"81c835875b0da5ad00c0eef0ef68928bd1f88ad221a8ad83b11565521d53a877"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.872415 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.872869 4983 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.3728494 +0000 UTC m=+211.972947830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.878885 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mjkh8" event={"ID":"211c2269-7173-4fcb-9403-be48b10ab364","Type":"ContainerStarted","Data":"cb4b2aacb2be069cb4c1685aaa72201857f54a52832e05eebd106f9c143a5354"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.879148 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.896891 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" event={"ID":"9c737bbb-9153-4689-bbd7-1925cd53b343","Type":"ContainerStarted","Data":"903a98ca3fc8090a6bba6e9a74bfd394cc0e3a734ad7457095bd40fd0923d09a"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.908022 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dcbe45123839f013b4b0ceffab4b0cee00aa8f3ec5219a7fb1dfc90d1e5eff99"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.908076 4983 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"06ddd8beef35d8422a3e7f922e22433833d8360b1c8b3f1f7472018fccc21447"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.908677 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.915071 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" podStartSLOduration=164.915038779 podStartE2EDuration="2m44.915038779s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.892608779 +0000 UTC m=+211.492707209" watchObservedRunningTime="2026-03-16 00:10:02.915038779 +0000 UTC m=+211.515137209" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.943042 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mjkh8" podStartSLOduration=7.943019124 podStartE2EDuration="7.943019124s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.911893805 +0000 UTC m=+211.511992255" watchObservedRunningTime="2026-03-16 00:10:02.943019124 +0000 UTC m=+211.543117554" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.945748 4983 generic.go:334] "Generic (PLEG): container finished" podID="249f0516-0237-4ba3-92eb-a7aa3b9c62c1" containerID="6a5c37f8634f9377e518dee2138e5fcfd25e9fa89f065c718af9889083a0f35a" exitCode=0 Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.945859 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerDied","Data":"6a5c37f8634f9377e518dee2138e5fcfd25e9fa89f065c718af9889083a0f35a"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.957334 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" podStartSLOduration=163.957312451 podStartE2EDuration="2m43.957312451s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:02.956454175 +0000 UTC m=+211.556552605" watchObservedRunningTime="2026-03-16 00:10:02.957312451 +0000 UTC m=+211.557410881" Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.966778 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" event={"ID":"bb33b891-4cdb-4fc1-95e4-2895f40fdb7a","Type":"ContainerStarted","Data":"acdc4e189ccf4e761d6043513cbad293003048c67a648d207e8f75197775a5cf"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.985427 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:02 crc kubenswrapper[4983]: E0316 00:10:02.986555 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.486537803 +0000 UTC m=+212.086636303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.990192 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d3912b7b9d1bd2499676a4f638d603dc9dbee76b6a15b4e26df0827023be11e9"} Mar 16 00:10:02 crc kubenswrapper[4983]: I0316 00:10:02.990234 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5b0530f23d114cca92c1d71521e7fcfd5ec54179eba9c0dc7e38700236d16627"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.004703 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" event={"ID":"de84a408-0f98-48c6-83a5-e6976b576989","Type":"ContainerStarted","Data":"4337cad217a97d1f2298838954e7ca6bd588c7455c966b033c070741a57ef710"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.004741 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" event={"ID":"de84a408-0f98-48c6-83a5-e6976b576989","Type":"ContainerStarted","Data":"447747f164006bcfcf2a9ddf6febc450a45b097b5067b51e460ed9a461ff8370"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.004973 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.012094 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" event={"ID":"6b0e4e23-a158-4597-b005-db088a652ec8","Type":"ContainerStarted","Data":"a83be389e601cd0deeac5a22896d87007bfbd4fcb5c5277d680cd9b94bcebd7e"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.017143 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9ad31635a8bdd45a91fd6f1e45def5a319d6c3c06026587eb7989f6f76d2be8a"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.017190 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bd6198faed51abc42a3fd4706252c5b14a466a8b93f87fb8628ca9bd7e3c74a1"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.019841 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5pqgr" podStartSLOduration=164.019829506 podStartE2EDuration="2m44.019829506s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.014516668 +0000 UTC m=+211.614615098" watchObservedRunningTime="2026-03-16 00:10:03.019829506 +0000 UTC m=+211.619927936" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.035824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" 
event={"ID":"6993dda4-ac10-47af-b406-d49d7781fbe5","Type":"ContainerStarted","Data":"09c2019faf6c1afbd5f21bf9ee34e82dda5d63d96d1b396605aef92fe6ba01c0"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.035883 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" event={"ID":"6993dda4-ac10-47af-b406-d49d7781fbe5","Type":"ContainerStarted","Data":"ad1bdf4b5b5bf503936d824be0bbab593dff431a6c2e73a488599b99cc8935f7"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.041913 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" event={"ID":"2688c073-5209-4258-a681-186370d9abcc","Type":"ContainerStarted","Data":"33bfa229c8d0a420615966bd13f8a47649b744e9e51c7ff16e5d3678e3862f6a"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.071771 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" event={"ID":"3b3cc32d-4d8c-47ee-bf9c-2319482ab78f","Type":"ContainerStarted","Data":"601daed22a504039799d8c8241a370755a02a384e33777087b32aa2a5d19b047"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.082713 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.083081 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.090998 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.091095 
4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.591076433 +0000 UTC m=+212.191174863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.091242 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.093802 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.593791384 +0000 UTC m=+212.193889814 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.095420 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" event={"ID":"61000119-35ce-40ee-a8c5-5ad9052b539d","Type":"ContainerStarted","Data":"17ad26d1a217c35c863e0de5f9a8aaeaa7505645e51dd84b4453328d77141812"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.095937 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.096053 4983 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-56ljn container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.096083 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" podUID="9c737bbb-9153-4689-bbd7-1925cd53b343" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.098835 4983 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-mhd52 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.098866 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" podUID="61000119-35ce-40ee-a8c5-5ad9052b539d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.110449 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-t4lj8" podStartSLOduration=164.110429641 podStartE2EDuration="2m44.110429641s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.101953588 +0000 UTC m=+211.702052018" watchObservedRunningTime="2026-03-16 00:10:03.110429641 +0000 UTC m=+211.710528071" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.140075 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" event={"ID":"22b9ac88-75ea-4572-bd27-f819caf4d8e2","Type":"ContainerStarted","Data":"8ce704030e1988d292db3e568f8d509b352df1c6490541930e4e6237b579a9fb"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.153113 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" event={"ID":"33ff8ce9-2d36-4251-ae9d-802d9965bfde","Type":"ContainerStarted","Data":"c97d7de6e52473a5f89c81863fac5338131db5dae8531818aeae4c774d1af7e8"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.153154 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" 
event={"ID":"33ff8ce9-2d36-4251-ae9d-802d9965bfde","Type":"ContainerStarted","Data":"f8a460c7ca84aeeb996af8f495c1120ff629101e9cc8957095906323e0a19a51"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.172553 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-l59k2" podStartSLOduration=165.172538464 podStartE2EDuration="2m45.172538464s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.170298918 +0000 UTC m=+211.770397358" watchObservedRunningTime="2026-03-16 00:10:03.172538464 +0000 UTC m=+211.772636894" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.173219 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" podStartSLOduration=164.173212695 podStartE2EDuration="2m44.173212695s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.13787538 +0000 UTC m=+211.737973810" watchObservedRunningTime="2026-03-16 00:10:03.173212695 +0000 UTC m=+211.773311125" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.176280 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" event={"ID":"31984625-3905-4d4d-9c52-e7d11c6c15d4","Type":"ContainerStarted","Data":"408ad5fdd3cf2f8905db91d37047012b541907ae6f543d36c294aebc0c9a0470"} Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.182716 4983 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-ql6v2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.182779 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" podUID="f2a01ee4-ddfb-428c-afc6-6bc0ff5af26d" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.183013 4983 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rvjb2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.187588 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.183262 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.183142 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tj49l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.187720 4983 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.195273 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.196411 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.696389746 +0000 UTC m=+212.296488176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.202924 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qvtjp" podStartSLOduration=164.202908241 podStartE2EDuration="2m44.202908241s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.201398196 +0000 UTC m=+211.801496626" watchObservedRunningTime="2026-03-16 00:10:03.202908241 +0000 UTC m=+211.803006681" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.297399 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.303449 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.803432681 +0000 UTC m=+212.403531131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.321157 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" podStartSLOduration=164.32113926 podStartE2EDuration="2m44.32113926s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.303846054 +0000 UTC m=+211.903944484" watchObservedRunningTime="2026-03-16 00:10:03.32113926 +0000 UTC m=+211.921237680" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.381032 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" podStartSLOduration=164.381014257 podStartE2EDuration="2m44.381014257s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.369291817 +0000 UTC m=+211.969390247" watchObservedRunningTime="2026-03-16 00:10:03.381014257 +0000 UTC m=+211.981112687" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.399556 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.400109 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:03.900094116 +0000 UTC m=+212.500192546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.487418 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ml6pw" podStartSLOduration=164.487402792 podStartE2EDuration="2m44.487402792s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.487079533 +0000 UTC m=+212.087177963" watchObservedRunningTime="2026-03-16 00:10:03.487402792 +0000 UTC m=+212.087501212" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.501663 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 
00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.502016 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.002003828 +0000 UTC m=+212.602102258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.583559 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-9k8tn" podStartSLOduration=164.583542622 podStartE2EDuration="2m44.583542622s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.578377328 +0000 UTC m=+212.178475758" watchObservedRunningTime="2026-03-16 00:10:03.583542622 +0000 UTC m=+212.183641042" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.605859 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.606039 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.106014852 +0000 UTC m=+212.706113282 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.606106 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.607295 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.10728274 +0000 UTC m=+212.707381170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.651916 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6ppdt" podStartSLOduration=164.651897152 podStartE2EDuration="2m44.651897152s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.64681541 +0000 UTC m=+212.246913860" watchObservedRunningTime="2026-03-16 00:10:03.651897152 +0000 UTC m=+212.251995582" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.708840 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.709242 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.209225313 +0000 UTC m=+212.809323753 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.813610 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.814002 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.31398607 +0000 UTC m=+212.914084500 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.827986 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:03 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:03 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:03 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.828043 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:03 crc kubenswrapper[4983]: I0316 00:10:03.923807 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:03 crc kubenswrapper[4983]: E0316 00:10:03.924152 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:04.424137557 +0000 UTC m=+213.024235987 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.025566 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.025947 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.525937056 +0000 UTC m=+213.126035486 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.127078 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.127349 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.627335212 +0000 UTC m=+213.227433642 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.192669 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvtjp" event={"ID":"6993dda4-ac10-47af-b406-d49d7781fbe5","Type":"ContainerStarted","Data":"2831085478f698d01baf77ca8701f5dd942580b8bae8558e6ac9202616b0ff9d"} Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.206666 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerStarted","Data":"31366c6413075ef6992692d577cecf196661d413fe9923130b380f38340fd868"} Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.214981 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"0c2c1a0046a1102ba0d7e2abbd5c28bec7b5e06f88bf9eaf2b7e311fd297f03a"} Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218569 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-tj49l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218633 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" 
podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218689 4983 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-np9wn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.218730 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" podUID="d09f2399-9ffb-4c0b-a4ac-f33bbd5186b7" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.228966 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.229510 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.729465331 +0000 UTC m=+213.329563761 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.256117 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-ql6v2" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.296353 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-mhd52" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.308864 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xdst8" podStartSLOduration=165.3088476 podStartE2EDuration="2m45.3088476s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:03.684086483 +0000 UTC m=+212.284184913" watchObservedRunningTime="2026-03-16 00:10:04.3088476 +0000 UTC m=+212.908946020" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.330264 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.330410 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.830386773 +0000 UTC m=+213.430485203 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.331013 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.334550 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.834532507 +0000 UTC m=+213.434631017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.432276 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.432435 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.932409938 +0000 UTC m=+213.532508368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.432648 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.432985 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:04.932971335 +0000 UTC m=+213.533069765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.534291 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.534410 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.034388412 +0000 UTC m=+213.634486842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.534513 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.534867 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.034855846 +0000 UTC m=+213.634954266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.576351 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.640336 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.640367 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.140348434 +0000 UTC m=+213.740446864 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.640782 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.641092 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.141082566 +0000 UTC m=+213.741180996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.741724 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.741906 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.241882975 +0000 UTC m=+213.841981405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.741934 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.742291 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.242283397 +0000 UTC m=+213.842381827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.826250 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:04 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:04 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:04 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.826315 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.842930 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.843248 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:05.343160058 +0000 UTC m=+213.943258488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.843319 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.843699 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.343683503 +0000 UTC m=+213.943781933 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.904440 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44210: no serving certificate available for the kubelet" Mar 16 00:10:04 crc kubenswrapper[4983]: I0316 00:10:04.945144 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:04 crc kubenswrapper[4983]: E0316 00:10:04.945413 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.445397609 +0000 UTC m=+214.045496039 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.006354 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44216: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.046282 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.046680 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.546663672 +0000 UTC m=+214.146762092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.112647 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44224: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.147916 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.148232 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.648217373 +0000 UTC m=+214.248315793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.227919 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44226: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.229993 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" event={"ID":"249f0516-0237-4ba3-92eb-a7aa3b9c62c1","Type":"ContainerStarted","Data":"7b6580d427d37b425db9500c884957b95c83c13d7f5ab0a1b2e5388690548529"} Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.249732 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.250133 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.750121474 +0000 UTC m=+214.350219904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.325615 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44228: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.351475 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.352499 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.85248356 +0000 UTC m=+214.452581990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.457542 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.457901 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:05.957889116 +0000 UTC m=+214.557987536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.558839 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.559218 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.059202618 +0000 UTC m=+214.659301048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.577165 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44234: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.599120 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" podStartSLOduration=167.599102469 podStartE2EDuration="2m47.599102469s" podCreationTimestamp="2026-03-16 00:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:05.267226135 +0000 UTC m=+213.867324565" watchObservedRunningTime="2026-03-16 00:10:05.599102469 +0000 UTC m=+214.199200899" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.602093 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.602995 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.612029 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.629085 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.643081 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44248: no serving certificate available for the kubelet" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.672665 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.678788 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680261 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680309 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680337 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680355 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680395 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680411 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.680445 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.680687 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.180676204 +0000 UTC m=+214.780774634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.683242 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783259 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.783780 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.28373857 +0000 UTC m=+214.883837000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783866 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783917 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.783992 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784036 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784081 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784120 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784151 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.784499 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.284488833 +0000 UTC m=+214.884587263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784616 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.784964 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.785026 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.785150 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.789427 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"]
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.806957 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"community-operators-vxnxc\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.824680 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"certified-operators-hsgsl\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.827387 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:05 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:05 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:05 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.827438 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.862396 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.863004 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.872168 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.872366 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.887393 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44262: no serving certificate available for the kubelet"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.887453 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-txzqn"]
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.888461 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.888843 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.889018 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.388998622 +0000 UTC m=+214.989097052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889608 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889654 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889692 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889779 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889822 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.889858 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:05 crc kubenswrapper[4983]: E0316 00:10:05.890382 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.390361153 +0000 UTC m=+214.990459583 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.899431 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.919917 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txzqn"]
Mar 16 00:10:05 crc kubenswrapper[4983]: I0316 00:10:05.931186 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002594 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002924 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002962 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.002997 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003023 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003054 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003562 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.003619 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.005224 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.50520209 +0000 UTC m=+215.105300520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.005460 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.015463 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.069742 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"community-operators-txzqn\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.086500 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.106787 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.107204 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.607187594 +0000 UTC m=+215.207286024 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.128302 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"]
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.129195 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"]
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.129274 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.211316 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.211668 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.711652872 +0000 UTC m=+215.311751302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.211747 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.231069 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txzqn"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.324281 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.324644 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.324804 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.325167 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.340174 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.840146257 +0000 UTC m=+215.440244687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.427964 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.428215 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.428289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.428332 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.429136 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.429212 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:06.929197045 +0000 UTC m=+215.529295475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.429400 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.480783 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"certified-operators-sv5g7\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " pod="openshift-marketplace/certified-operators-sv5g7"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.518293 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"]
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.518486 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" containerID="cri-o://fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756" gracePeriod=30
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.529169 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.529504 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.029491459 +0000 UTC m=+215.629589879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.548142 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.550729 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"]
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.552057 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager" containerID="cri-o://8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013" gracePeriod=30
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.611744 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44276: no serving certificate available for the kubelet"
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.619044 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"]
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.630202 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.630719 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.130524794 +0000 UTC m=+215.730623224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:06 crc kubenswrapper[4983]: W0316 00:10:06.730274 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf617dbbc_f757_49b9_b8c6_7d0c07cb197e.slice/crio-560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b WatchSource:0}: Error finding container 560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b: Status 404 returned error can't find the container with id 560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b
Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.732043 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.732326 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.232314382 +0000 UTC m=+215.832412802 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.757843 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.769102 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.829488 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:06 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:06 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:06 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.829848 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.837462 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 00:10:06.837818 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.337799161 +0000 UTC m=+215.937897591 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.842833 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txzqn"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.886164 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-np9wn" Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.888033 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:10:06 crc kubenswrapper[4983]: I0316 00:10:06.942106 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:06 crc kubenswrapper[4983]: E0316 
00:10:06.942610 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.442594089 +0000 UTC m=+216.042692519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.030877 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.042886 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.043551 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.543533211 +0000 UTC m=+216.143631651 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145449 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145634 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145663 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145702 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.145732 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") pod \"54a768f3-aa53-481d-b179-5c8807f69e89\" (UID: \"54a768f3-aa53-481d-b179-5c8807f69e89\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.146586 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config" (OuterVolumeSpecName: "config") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.146788 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca" (OuterVolumeSpecName: "client-ca") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.147054 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148470 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148825 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148838 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.148847 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54a768f3-aa53-481d-b179-5c8807f69e89-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.149079 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.649066621 +0000 UTC m=+216.249165051 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.150257 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.158991 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9" (OuterVolumeSpecName: "kube-api-access-9zjz9") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "kube-api-access-9zjz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.179214 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "54a768f3-aa53-481d-b179-5c8807f69e89" (UID: "54a768f3-aa53-481d-b179-5c8807f69e89"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229161 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.229352 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229363 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.229372 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229378 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229468 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" containerName="controller-manager" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229479 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerName="route-controller-manager" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.229803 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.248835 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249596 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249628 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249657 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249688 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249707 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.249726 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") pod \"8ac1b1cc-8499-493f-a8d9-801eb433163f\" (UID: \"8ac1b1cc-8499-493f-a8d9-801eb433163f\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.250143 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.250237 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.251580 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca" (OuterVolumeSpecName: "client-ca") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: 
"8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.252043 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config" (OuterVolumeSpecName: "config") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.252144 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.752124367 +0000 UTC m=+216.352222797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253366 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253483 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253549 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253597 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253684 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54a768f3-aa53-481d-b179-5c8807f69e89-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253699 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253712 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zjz9\" (UniqueName: \"kubernetes.io/projected/54a768f3-aa53-481d-b179-5c8807f69e89-kube-api-access-9zjz9\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc 
kubenswrapper[4983]: I0316 00:10:07.253725 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ac1b1cc-8499-493f-a8d9-801eb433163f-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.253852 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.255958 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.755943631 +0000 UTC m=+216.356042061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.268283 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.268431 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h" (OuterVolumeSpecName: "kube-api-access-vg86h") pod "8ac1b1cc-8499-493f-a8d9-801eb433163f" (UID: "8ac1b1cc-8499-493f-a8d9-801eb433163f"). InnerVolumeSpecName "kube-api-access-vg86h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.274928 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.279206 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerStarted","Data":"5b6b697b43a3ca9aed435659be5a4adfa260345d670f3e9fc7b2402ed1c8de07"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.283643 4983 generic.go:334] "Generic (PLEG): container finished" podID="54a768f3-aa53-481d-b179-5c8807f69e89" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756" exitCode=0 Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.283742 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.284235 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerDied","Data":"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.284328 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9hbqr" event={"ID":"54a768f3-aa53-481d-b179-5c8807f69e89","Type":"ContainerDied","Data":"b3aab4335c7ccfdf2161f33d50cd419255438398c3dcb373c9fee2523012c9eb"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.284389 4983 scope.go:117] "RemoveContainer" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.293524 4983 generic.go:334] "Generic (PLEG): container finished" podID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerID="1fc80a9e4fb01c05cb775f45190ece9037ca337a03452dd8abf5a08dd242d1da" exitCode=0 Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.293600 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"1fc80a9e4fb01c05cb775f45190ece9037ca337a03452dd8abf5a08dd242d1da"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.293624 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerStarted","Data":"560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.313853 4983 generic.go:334] "Generic (PLEG): container finished" 
podID="8ac1b1cc-8499-493f-a8d9-801eb433163f" containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013" exitCode=0 Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.313924 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerDied","Data":"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.313950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" event={"ID":"8ac1b1cc-8499-493f-a8d9-801eb433163f","Type":"ContainerDied","Data":"8665fea52c7fa59500f204d3f9650e9dc31f16bb6f517767fe49fb869cd3f2b0"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.314005 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.322055 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.322092 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"6ff5c36eac345013e6cc957efaa73b943a59621eebe35f0b43b2431024b1cecb"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.333013 4983 scope.go:117] "RemoveContainer" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.333722 4983 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756\": container with ID starting with fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756 not found: ID does not exist" containerID="fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.333772 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756"} err="failed to get container status \"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756\": rpc error: code = NotFound desc = could not find container \"fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756\": container with ID starting with fa897511ec6823eee420e9d0f23379f68aa9864487bb833f6e1636729b910756 not found: ID does not exist" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.333799 4983 scope.go:117] "RemoveContainer" containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.347313 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.347843 4983 generic.go:334] "Generic (PLEG): container finished" podID="0153d604-68c6-465e-9714-463f0e7e4c41" containerID="81c835875b0da5ad00c0eef0ef68928bd1f88ad221a8ad83b11565521d53a877" exitCode=0 Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.347913 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerDied","Data":"81c835875b0da5ad00c0eef0ef68928bd1f88ad221a8ad83b11565521d53a877"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360191 4983 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360641 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360867 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360904 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360940 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360963 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtv5\" (UniqueName: 
\"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.360983 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361001 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361022 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361044 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " 
pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361072 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361117 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ac1b1cc-8499-493f-a8d9-801eb433163f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.361127 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg86h\" (UniqueName: \"kubernetes.io/projected/8ac1b1cc-8499-493f-a8d9-801eb433163f-kube-api-access-vg86h\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.361821 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.861806741 +0000 UTC m=+216.461905171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.363110 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.364109 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.367413 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9hbqr"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.367572 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerStarted","Data":"2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.368706 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" 
event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerStarted","Data":"de21ac29d1b3f85746eecc6275790d886e43e62e160f35ab6e888afb27d08a5c"} Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.368973 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.369531 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.393564 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"controller-manager-76ff476bcc-pgmwb\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.411528 4983 scope.go:117] "RemoveContainer" containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.412096 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013\": container with ID starting with 8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013 not found: ID does not exist" 
containerID="8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.412136 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013"} err="failed to get container status \"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013\": rpc error: code = NotFound desc = could not find container \"8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013\": container with ID starting with 8f8531cc72fd7bd34b673a648975b4757a377a891eda2a84d77672ca96d26013 not found: ID does not exist" Mar 16 00:10:07 crc kubenswrapper[4983]: W0316 00:10:07.414845 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6bd9bf5_fa59_4fef_9589_7b5865098bd2.slice/crio-aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e WatchSource:0}: Error finding container aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e: Status 404 returned error can't find the container with id aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.430502 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.432812 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rvjb2"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.461703 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: 
\"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.461984 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462021 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462045 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462074 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.462105 4983 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:07.962091424 +0000 UTC m=+216.562189844 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.462872 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.463178 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.476321 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc 
kubenswrapper[4983]: I0316 00:10:07.479796 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"route-controller-manager-74d65d8956-b8lr7\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.555999 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.563131 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.564269 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.064238843 +0000 UTC m=+216.664337283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.580572 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.636043 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.637639 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.642048 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665531 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665582 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665668 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.665705 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.666089 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.166072492 +0000 UTC m=+216.766170922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.678779 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.700279 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.704173 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.707514 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.708368 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.733738 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.760258 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767404 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767612 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.767715 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:08.267687965 +0000 UTC m=+216.867786395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767801 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767892 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.767923 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.768080 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmd9\" (UniqueName: 
\"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.768113 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.768582 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.768831 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.268824619 +0000 UTC m=+216.868923049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.769311 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.799569 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"redhat-marketplace-b68d7\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.827512 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-lx4mf" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.827604 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.834578 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:07 crc kubenswrapper[4983]: 
[-]has-synced failed: reason withheld Mar 16 00:10:07 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:07 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.834628 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.870199 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.870463 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.870504 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.871435 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.871874 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.371840484 +0000 UTC m=+216.971938914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881242 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881307 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881621 4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.881635 4983 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.892528 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.936625 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44280: no serving certificate available for the kubelet" Mar 16 00:10:07 crc kubenswrapper[4983]: I0316 00:10:07.971851 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:07 crc kubenswrapper[4983]: E0316 00:10:07.972275 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.472262511 +0000 UTC m=+217.072360941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.002342 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.017445 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.025143 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.051373 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"] Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.057958 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"] Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.058086 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.073585 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.074039 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.574020818 +0000 UTC m=+217.174119248 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.106053 4983 patch_prober.go:28] interesting pod/console-f9d7485db-fp4l5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.106105 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fp4l5" podUID="d76474c2-7d5c-45a0-8869-d829b0c594d6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection 
refused" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.115688 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a768f3-aa53-481d-b179-5c8807f69e89" path="/var/lib/kubelet/pods/54a768f3-aa53-481d-b179-5c8807f69e89/volumes" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.116665 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac1b1cc-8499-493f-a8d9-801eb433163f" path="/var/lib/kubelet/pods/8ac1b1cc-8499-493f-a8d9-801eb433163f/volumes" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.117811 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.117839 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.117937 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.132897 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-56ljn" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176652 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176720 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"redhat-marketplace-kjc2w\" (UID: 
\"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176770 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.176915 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.180404 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.680388693 +0000 UTC m=+217.280487193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279339 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279692 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.279858 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " 
pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.280399 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.280519 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.780478531 +0000 UTC m=+217.380576961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.281256 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"redhat-marketplace-kjc2w\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.304622 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"redhat-marketplace-kjc2w\" (UID: 
\"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") " pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.380969 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.381464 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.881451834 +0000 UTC m=+217.481550264 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.387987 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.407027 4983 generic.go:334] "Generic (PLEG): container finished" podID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerID="7455d52b296ac2dc05d5dba007a96face87721af18e58d348eedd55fbc4a2082" exitCode=0 Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.407697 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"7455d52b296ac2dc05d5dba007a96face87721af18e58d348eedd55fbc4a2082"} Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.407750 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerStarted","Data":"aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e"} Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.419102 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerStarted","Data":"750a85be2f0f629cc184ac0a4c018b832bba1ef5898acd3b3254238edafdcee9"} Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.430494 4983 generic.go:334] "Generic (PLEG): container finished" podID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerID="2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4" exitCode=0 Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.430829 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4"} Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.434524 4983 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-mw6rk" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.437661 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.449279 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerStarted","Data":"f9b599001c13f639d451ceabf88f4a53c98624ba99597dffc75a2261d1939597"} Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.451589 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerStarted","Data":"75bf14131d5b8d3db0d67d7f812d7d6f097de077cb0e31a121d0d18e80488d4e"} Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.453651 4983 generic.go:334] "Generic (PLEG): container finished" podID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c" exitCode=0 Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.455131 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"} Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.483358 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:08 crc 
kubenswrapper[4983]: E0316 00:10:08.484495 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:08.984473709 +0000 UTC m=+217.584572139 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.496024 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:10:08 crc kubenswrapper[4983]: W0316 00:10:08.540921 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbebf69d_773f_4829_a4ec_e443d52ef275.slice/crio-5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45 WatchSource:0}: Error finding container 5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45: Status 404 returned error can't find the container with id 5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45 Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.558072 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.580429 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.580403553 podStartE2EDuration="3.580403553s" 
podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:08.579181966 +0000 UTC m=+217.179280396" watchObservedRunningTime="2026-03-16 00:10:08.580403553 +0000 UTC m=+217.180501973" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.585120 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.585508 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.085496225 +0000 UTC m=+217.685594655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.640028 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.641356 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.646036 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.647718 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.686989 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.687420 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.687482 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.687567 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"redhat-operators-56c2t\" (UID: 
\"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.687878 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.18786254 +0000 UTC m=+217.787960970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788632 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788711 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788772 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.788820 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.789537 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.789818 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.790032 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.290021379 +0000 UTC m=+217.890119809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.811180 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"redhat-operators-56c2t\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.819937 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.822462 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:08 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:08 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:08 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.822503 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.839939 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.840007 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.853916 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.889776 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") pod \"0153d604-68c6-465e-9714-463f0e7e4c41\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.889846 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") pod \"0153d604-68c6-465e-9714-463f0e7e4c41\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.890178 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.890255 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") pod \"0153d604-68c6-465e-9714-463f0e7e4c41\" (UID: \"0153d604-68c6-465e-9714-463f0e7e4c41\") " Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 
00:10:08.891038 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.391009933 +0000 UTC m=+217.991108353 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.893078 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume" (OuterVolumeSpecName: "config-volume") pod "0153d604-68c6-465e-9714-463f0e7e4c41" (UID: "0153d604-68c6-465e-9714-463f0e7e4c41"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.897804 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q" (OuterVolumeSpecName: "kube-api-access-cx99q") pod "0153d604-68c6-465e-9714-463f0e7e4c41" (UID: "0153d604-68c6-465e-9714-463f0e7e4c41"). InnerVolumeSpecName "kube-api-access-cx99q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.897820 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0153d604-68c6-465e-9714-463f0e7e4c41" (UID: "0153d604-68c6-465e-9714-463f0e7e4c41"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.986833 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997236 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997362 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0153d604-68c6-465e-9714-463f0e7e4c41-config-volume\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997376 4983 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0153d604-68c6-465e-9714-463f0e7e4c41-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:08 crc kubenswrapper[4983]: I0316 00:10:08.997386 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx99q\" (UniqueName: \"kubernetes.io/projected/0153d604-68c6-465e-9714-463f0e7e4c41-kube-api-access-cx99q\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:08 crc kubenswrapper[4983]: E0316 00:10:08.997687 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.497671567 +0000 UTC m=+218.097769997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.014649 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.060184 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.060390 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0153d604-68c6-465e-9714-463f0e7e4c41" containerName="collect-profiles"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.060404 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="0153d604-68c6-465e-9714-463f0e7e4c41" containerName="collect-profiles"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.060488 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="0153d604-68c6-465e-9714-463f0e7e4c41" containerName="collect-profiles"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.061329 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.072974 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.098897 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.101386 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.60135441 +0000 UTC m=+218.201452840 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.101525 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.101662 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.101734 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203323 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203888 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.203927 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.204830 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.704810148 +0000 UTC m=+218.304908578 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.205353 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.206553 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.229180 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"redhat-operators-7qx9g\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") " pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.304945 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.305136 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.805103422 +0000 UTC m=+218.405201852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.305322 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.305704 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.805688849 +0000 UTC m=+218.405787279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.321568 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.399374 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.406603 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.406730 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.906704644 +0000 UTC m=+218.506803064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.407033 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.407355 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:09.907347764 +0000 UTC m=+218.507446194 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.464248 4983 generic.go:334] "Generic (PLEG): container finished" podID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerID="f9b599001c13f639d451ceabf88f4a53c98624ba99597dffc75a2261d1939597" exitCode=0
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.464323 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerDied","Data":"f9b599001c13f639d451ceabf88f4a53c98624ba99597dffc75a2261d1939597"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.471844 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerStarted","Data":"3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.472912 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.475177 4983 generic.go:334] "Generic (PLEG): container finished" podID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerID="b832baa9ad863d92bef0f4bd68918c75a656cd7a0c7e14efd5e15110ac3d6de8" exitCode=0
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.475229 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"b832baa9ad863d92bef0f4bd68918c75a656cd7a0c7e14efd5e15110ac3d6de8"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.475248 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerStarted","Data":"5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.477277 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4" event={"ID":"0153d604-68c6-465e-9714-463f0e7e4c41","Type":"ContainerDied","Data":"2f6ac418ab83db7361af1e5d0897d96c9e84cd20e3d27e7aa8176847f1f3a492"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.477307 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6ac418ab83db7361af1e5d0897d96c9e84cd20e3d27e7aa8176847f1f3a492"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.477381 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560320-mtrv4"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.496845 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerStarted","Data":"edfb4c106db9ff156e89258c7be736e143b651348ae2eece9c28a73c16f1a791"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.499075 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.506818 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerStarted","Data":"839c30c9cbe107a7c9f0dd7cc6175826e37c3a950a4d5a9be034e934974f0bc3"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.508838 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.509012 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.008980917 +0000 UTC m=+218.609079357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.509533 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.510693 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.010678608 +0000 UTC m=+218.610777118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.516737 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerStarted","Data":"6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.517573 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.521664 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerStarted","Data":"4e388f80539aba9aecacdcf41ea98fd1759f1315a9da531e5c2e5ed8b94369f5"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.521713 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerStarted","Data":"cfb360083edf9d08a20272d5f7ca0aec35055c4e7e0874048d95f64598422a3b"}
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.523122 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.542925 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podStartSLOduration=2.542901749 podStartE2EDuration="2.542901749s" podCreationTimestamp="2026-03-16 00:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:09.541773846 +0000 UTC m=+218.141872276" watchObservedRunningTime="2026-03-16 00:10:09.542901749 +0000 UTC m=+218.143000189"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.543680 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podStartSLOduration=2.543672732 podStartE2EDuration="2.543672732s" podCreationTimestamp="2026-03-16 00:10:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:09.521530162 +0000 UTC m=+218.121628592" watchObservedRunningTime="2026-03-16 00:10:09.543672732 +0000 UTC m=+218.143771162"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.610684 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.612509 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.112488526 +0000 UTC m=+218.712586966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.715636 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.716153 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.21613738 +0000 UTC m=+218.816235840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.816567 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.816997 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.31697737 +0000 UTC m=+218.917075800 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.829222 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:09 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:09 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:09 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.829284 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.832443 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:10:09 crc kubenswrapper[4983]: I0316 00:10:09.917745 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:09 crc kubenswrapper[4983]: E0316 00:10:09.918148 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.418129699 +0000 UTC m=+219.018228129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.021050 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.021573 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.521541165 +0000 UTC m=+219.121639595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.114007 4983 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lc9bv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]log ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]etcd ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/max-in-flight-filter ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 16 00:10:10 crc kubenswrapper[4983]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 16 00:10:10 crc kubenswrapper[4983]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 16 00:10:10 crc kubenswrapper[4983]: livez check failed
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.114079 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv" podUID="249f0516-0237-4ba3-92eb-a7aa3b9c62c1" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.122101 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.122503 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.622490979 +0000 UTC m=+219.222589409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.223370 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.723350069 +0000 UTC m=+219.323448499 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.223574 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.223849 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.224113 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.724105161 +0000 UTC m=+219.324203591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.325328 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.325520 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.825494498 +0000 UTC m=+219.425592928 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.325633 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.325931 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.82591921 +0000 UTC m=+219.426017640 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.426318 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.426503 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:10.926477962 +0000 UTC m=+219.526576392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.426898 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-16 00:10:10.926891384 +0000 UTC m=+219.526989814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.426952 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.443538 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mjkh8" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.517851 4983 ???:1] "http: TLS handshake error from 192.168.126.11:44292: no serving certificate available for the kubelet" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.527501 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.527696 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.027670322 +0000 UTC m=+219.627768752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.527806 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.528088 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.028076454 +0000 UTC m=+219.628174884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.532733 4983 generic.go:334] "Generic (PLEG): container finished" podID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32" exitCode=0 Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.532821 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.535275 4983 generic.go:334] "Generic (PLEG): container finished" podID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerID="0601a98e47222baf45860438cfc29d0447fa64cf46cd7bead9a6ef97f07beb9c" exitCode=0 Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.535344 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"0601a98e47222baf45860438cfc29d0447fa64cf46cd7bead9a6ef97f07beb9c"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.537924 4983 generic.go:334] "Generic (PLEG): container finished" podID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerID="4e388f80539aba9aecacdcf41ea98fd1759f1315a9da531e5c2e5ed8b94369f5" exitCode=0 Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.538014 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerDied","Data":"4e388f80539aba9aecacdcf41ea98fd1759f1315a9da531e5c2e5ed8b94369f5"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.541363 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerStarted","Data":"66c02382f4884cf7432e8b1dd2d9aae721248d87c7cd3a1bce60e42991bb56c4"} Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.629948 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.630098 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.130077009 +0000 UTC m=+219.730175439 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.630584 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.631744 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.131729308 +0000 UTC m=+219.731827738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.733772 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.734052 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.233964329 +0000 UTC m=+219.834062759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.734112 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.734464 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.234447794 +0000 UTC m=+219.834546224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.809922 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.823786 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:10 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:10 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:10 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.823850 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.835576 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.835872 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.335858951 +0000 UTC m=+219.935957381 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.936455 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") pod \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.936861 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") pod \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\" (UID: \"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b\") " Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.936879 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" (UID: "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.937206 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:10 crc kubenswrapper[4983]: E0316 00:10:10.937627 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.437612558 +0000 UTC m=+220.037710988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.937935 4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:10 crc kubenswrapper[4983]: I0316 00:10:10.942693 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" (UID: "1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.039459 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.039697 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.539654683 +0000 UTC m=+220.139753113 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.040103 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.040189 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b-kube-api-access\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.040553 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.54053988 +0000 UTC m=+220.140638360 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.140895 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.141434 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.6413945 +0000 UTC m=+220.241492930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.242833 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.243405 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.743386334 +0000 UTC m=+220.343484774 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.345131 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.345367 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.845338767 +0000 UTC m=+220.445437197 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.345963 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.346412 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.846400969 +0000 UTC m=+220.446499399 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.446732    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.447012    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.946980481 +0000 UTC m=+220.547078921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.447099    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.447367    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:11.947354822 +0000 UTC m=+220.547453252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.539886    4983 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.548839    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.549027    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.049005616 +0000 UTC m=+220.649104046 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.549064    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.550394    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.050376217 +0000 UTC m=+220.650474637 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.553857    4983 generic.go:334] "Generic (PLEG): container finished" podID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerID="b9712062cb37f4ba2339e9dc2def8ff36e2a54d5fce9ebcc83e68db1e8c9e216" exitCode=0
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.553935    4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"b9712062cb37f4ba2339e9dc2def8ff36e2a54d5fce9ebcc83e68db1e8c9e216"}
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.562413    4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"35c30c12a549c9f3a066c2d3d7362fbdedb473c53e36f73d0bb2b4532a71aa3e"}
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.562468    4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"6ed9f58a27e42d37ee961a0fda8db32a5fda0f9c1a37b58b3524532b2d28e46d"}
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.565729    4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b","Type":"ContainerDied","Data":"5b6b697b43a3ca9aed435659be5a4adfa260345d670f3e9fc7b2402ed1c8de07"}
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.565775    4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b6b697b43a3ca9aed435659be5a4adfa260345d670f3e9fc7b2402ed1c8de07"
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.565890    4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.650860    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.651479    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.151233637 +0000 UTC m=+220.751332067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.652273    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.653851    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.153837475 +0000 UTC m=+220.753935895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.753400    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.753603    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.253577262 +0000 UTC m=+220.853675692 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.753650    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.754060    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.254046076 +0000 UTC m=+220.854144506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.820967    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:11 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:11 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:11 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.821050    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.854739    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.855154    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.355134713 +0000 UTC m=+220.955233143 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:11 crc kubenswrapper[4983]: I0316 00:10:11.955830    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:11 crc kubenswrapper[4983]: E0316 00:10:11.956181    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.456166158 +0000 UTC m=+221.056264588 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.056816    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.056899    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.556873584 +0000 UTC m=+221.156972014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.057386    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.057795    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.557779061 +0000 UTC m=+221.157877491 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.159902    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.160063    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.660036383 +0000 UTC m=+221.260134813 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.160240    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.160548    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.660538268 +0000 UTC m=+221.260636698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4sm6x" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.261688    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:12 crc kubenswrapper[4983]: E0316 00:10:12.261999    4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-16 00:10:12.761984506 +0000 UTC m=+221.362082936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.310838    4983 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-16T00:10:11.539912035Z","Handler":null,"Name":""}
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.313048    4983 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.313074    4983 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.362665    4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.365334    4983 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.365363    4983 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.413642    4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4sm6x\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.463742    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.470386    4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.540009    4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.574447    4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" event={"ID":"8820c8ae-e5d3-4c91-8724-ec666e783179","Type":"ContainerStarted","Data":"9ee5bb4119e3b16046ba33eca7ca88e39672de7857fa0ee6fe3cdfffeb59f2f3"}
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.605314    4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-n22z7" podStartSLOduration=17.605289533 podStartE2EDuration="17.605289533s" podCreationTimestamp="2026-03-16 00:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:10:12.594535042 +0000 UTC m=+221.194633492" watchObservedRunningTime="2026-03-16 00:10:12.605289533 +0000 UTC m=+221.205387973"
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.821328    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:12 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:12 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:12 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:12 crc kubenswrapper[4983]: I0316 00:10:12.821408    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.230172    4983 ???:1] "http: TLS handshake error from 192.168.126.11:52122: no serving certificate available for the kubelet"
Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.820692    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:13 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:13 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:13 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.821179    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.844873    4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:10:13 crc kubenswrapper[4983]: I0316 00:10:13.848765    4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lc9bv"
Mar 16 00:10:14 crc kubenswrapper[4983]: I0316 00:10:14.100698    4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 16 00:10:14 crc kubenswrapper[4983]: I0316 00:10:14.820125    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:14 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:14 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:14 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:14 crc kubenswrapper[4983]: I0316 00:10:14.820176    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.658679    4983 ???:1] "http: TLS handshake error from 192.168.126.11:52126: no serving certificate available for the kubelet"
Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.820597    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:15 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:15 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:15 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.821225    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:15 crc kubenswrapper[4983]: I0316 00:10:15.912652    4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.015921    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") pod \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") "
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.016583    4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") pod \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\" (UID: \"385ebee4-3c06-4dcb-89d4-999ba793a9ba\") "
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.016728    4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "385ebee4-3c06-4dcb-89d4-999ba793a9ba" (UID: "385ebee4-3c06-4dcb-89d4-999ba793a9ba"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.017133    4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.024889    4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "385ebee4-3c06-4dcb-89d4-999ba793a9ba" (UID: "385ebee4-3c06-4dcb-89d4-999ba793a9ba"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.117969    4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/385ebee4-3c06-4dcb-89d4-999ba793a9ba-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.609613    4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"385ebee4-3c06-4dcb-89d4-999ba793a9ba","Type":"ContainerDied","Data":"cfb360083edf9d08a20272d5f7ca0aec35055c4e7e0874048d95f64598422a3b"}
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.609653    4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfb360083edf9d08a20272d5f7ca0aec35055c4e7e0874048d95f64598422a3b"
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.609676    4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.832196    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:16 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:16 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:16 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:16 crc kubenswrapper[4983]: I0316 00:10:16.832270    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.821164    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:17 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:17 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:17 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.821253    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880187    4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880220    4983 patch_prober.go:28] interesting pod/downloads-7954f5f757-6j9qt container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880232    4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 16 00:10:17 crc kubenswrapper[4983]: I0316 00:10:17.880270    4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-6j9qt" podUID="211771ed-66f1-4866-b193-5da61bbd38b4" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.096416    4983 patch_prober.go:28] interesting pod/console-f9d7485db-fp4l5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.096473    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fp4l5" podUID="d76474c2-7d5c-45a0-8869-d829b0c594d6" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.820398    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:18 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld
Mar 16 00:10:18 crc kubenswrapper[4983]: [+]process-running ok
Mar 16 00:10:18 crc kubenswrapper[4983]: healthz check failed
Mar 16 00:10:18 crc kubenswrapper[4983]: I0316 00:10:18.820733    4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 16 00:10:19 crc kubenswrapper[4983]: I0316 00:10:19.819877    4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 16 00:10:19
crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:19 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:19 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:19 crc kubenswrapper[4983]: I0316 00:10:19.819955 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:20 crc kubenswrapper[4983]: I0316 00:10:20.823211 4983 patch_prober.go:28] interesting pod/router-default-5444994796-w8qpq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 16 00:10:20 crc kubenswrapper[4983]: [-]has-synced failed: reason withheld Mar 16 00:10:20 crc kubenswrapper[4983]: [+]process-running ok Mar 16 00:10:20 crc kubenswrapper[4983]: healthz check failed Mar 16 00:10:20 crc kubenswrapper[4983]: I0316 00:10:20.823505 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-w8qpq" podUID="dfea0242-abc1-4912-a193-6c4dc75d9bb5" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 16 00:10:21 crc kubenswrapper[4983]: I0316 00:10:21.821434 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:10:21 crc kubenswrapper[4983]: I0316 00:10:21.824981 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-w8qpq" Mar 16 00:10:23 crc kubenswrapper[4983]: I0316 00:10:23.449171 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:10:23 crc kubenswrapper[4983]: I0316 00:10:23.449549 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.905191 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.905647 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" containerID="cri-o://3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8" gracePeriod=30 Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.919053 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:25 crc kubenswrapper[4983]: I0316 00:10:25.919248 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" containerID="cri-o://6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908" gracePeriod=30 Mar 16 00:10:26 crc kubenswrapper[4983]: E0316 00:10:26.754969 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 16 00:10:26 crc 
kubenswrapper[4983]: E0316 00:10:26.756661 4983 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:10:26 crc kubenswrapper[4983]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 16 00:10:26 crc kubenswrapper[4983]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gs7vq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29560328-sngnj_openshift-infra(9da42bf3-da76-4db7-9653-f2f08567084f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 16 00:10:26 crc kubenswrapper[4983]: > logger="UnhandledError" Mar 16 00:10:26 crc kubenswrapper[4983]: E0316 00:10:26.757804 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29560328-sngnj" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.557001 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.557074 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.581562 4983 patch_prober.go:28] interesting pod/route-controller-manager-74d65d8956-b8lr7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.581631 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.685557 4983 generic.go:334] "Generic (PLEG): container finished" podID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerID="6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908" exitCode=0 Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.685656 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" 
event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerDied","Data":"6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908"} Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.688205 4983 generic.go:334] "Generic (PLEG): container finished" podID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerID="3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8" exitCode=0 Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.688295 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerDied","Data":"3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8"} Mar 16 00:10:27 crc kubenswrapper[4983]: E0316 00:10:27.689884 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29560328-sngnj" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" Mar 16 00:10:27 crc kubenswrapper[4983]: I0316 00:10:27.891024 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-6j9qt" Mar 16 00:10:28 crc kubenswrapper[4983]: I0316 00:10:28.102348 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:10:28 crc kubenswrapper[4983]: I0316 00:10:28.106822 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-fp4l5" Mar 16 00:10:29 crc kubenswrapper[4983]: E0316 00:10:29.203310 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 16 00:10:29 crc kubenswrapper[4983]: 
E0316 00:10:29.203463 4983 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 16 00:10:29 crc kubenswrapper[4983]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 16 00:10:29 crc kubenswrapper[4983]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtp9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29560330-65dr5_openshift-infra(c39b8480-5521-4ff7-b6ec-4f67009b1f5c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 16 00:10:29 crc kubenswrapper[4983]: > logger="UnhandledError" Mar 16 00:10:29 crc kubenswrapper[4983]: E0316 00:10:29.204675 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29560330-65dr5" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" Mar 16 00:10:29 crc kubenswrapper[4983]: E0316 00:10:29.704150 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29560330-65dr5" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" Mar 16 00:10:36 crc kubenswrapper[4983]: I0316 00:10:36.159837 4983 ???:1] "http: TLS handshake error from 192.168.126.11:41342: no serving certificate available for the kubelet" Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.556850 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.556913 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.581490 4983 patch_prober.go:28] interesting pod/route-controller-manager-74d65d8956-b8lr7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 16 00:10:37 crc kubenswrapper[4983]: I0316 00:10:37.581534 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 16 00:10:38 crc kubenswrapper[4983]: I0316 00:10:38.620966 
4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-hqnds" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895213 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:10:40 crc kubenswrapper[4983]: E0316 00:10:40.895817 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895834 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: E0316 00:10:40.895846 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895853 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895963 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="385ebee4-3c06-4dcb-89d4-999ba793a9ba" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.895978 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cbcc2b4-624d-4c26-bfcf-eb06aef3d77b" containerName="pruner" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.896398 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.901812 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.902155 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.915219 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.985261 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:40 crc kubenswrapper[4983]: I0316 00:10:40.985590 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.088825 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.088906 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.089362 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.114497 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.222014 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:10:41 crc kubenswrapper[4983]: I0316 00:10:41.620232 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 16 00:10:45 crc kubenswrapper[4983]: E0316 00:10:45.476505 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3436131020/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:10:45 crc kubenswrapper[4983]: E0316 00:10:45.476926 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msk49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-56c2t_openshift-marketplace(8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3436131020/2\": happened during read: context canceled" logger="UnhandledError" Mar 16 00:10:45 crc kubenswrapper[4983]: E0316 00:10:45.478167 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage3436131020/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-56c2t" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.490566 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.491610 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.496861 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.539246 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.539291 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.539413 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640530 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640633 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640661 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.640734 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.641007 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.659539 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") 
pod \"installer-9-crc\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:45 crc kubenswrapper[4983]: I0316 00:10:45.827072 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:10:46 crc kubenswrapper[4983]: E0316 00:10:46.685185 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:10:46 crc kubenswrapper[4983]: E0316 00:10:46.685348 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbbl2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,Windows
Options:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vxnxc_openshift-marketplace(f617dbbc-f757-49b9-b8c6-7d0c07cb197e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:46 crc kubenswrapper[4983]: E0316 00:10:46.686684 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vxnxc" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.556432 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.556515 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.581164 4983 patch_prober.go:28] interesting pod/route-controller-manager-74d65d8956-b8lr7 container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:48 crc kubenswrapper[4983]: I0316 00:10:48.581229 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:50 crc kubenswrapper[4983]: E0316 00:10:50.021966 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vxnxc" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" Mar 16 00:10:50 crc kubenswrapper[4983]: E0316 00:10:50.022036 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-56c2t" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.073333 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101376 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101451 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101517 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.101537 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") pod \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\" (UID: \"5a30db24-326a-4f24-8ea0-e3d1367a2b76\") " Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.102664 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca" (OuterVolumeSpecName: "client-ca") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.102683 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config" (OuterVolumeSpecName: "config") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.107417 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.108033 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp" (OuterVolumeSpecName: "kube-api-access-mdbwp") pod "5a30db24-326a-4f24-8ea0-e3d1367a2b76" (UID: "5a30db24-326a-4f24-8ea0-e3d1367a2b76"). InnerVolumeSpecName "kube-api-access-mdbwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.109628 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"] Mar 16 00:10:50 crc kubenswrapper[4983]: E0316 00:10:50.109869 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.109883 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.110032 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" containerName="route-controller-manager" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.110531 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.120524 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"] Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.202728 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203036 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203166 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203252 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203391 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a30db24-326a-4f24-8ea0-e3d1367a2b76-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203489 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203574 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdbwp\" (UniqueName: \"kubernetes.io/projected/5a30db24-326a-4f24-8ea0-e3d1367a2b76-kube-api-access-mdbwp\") on node \"crc\" DevicePath \"\"" Mar 16 
00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.203636 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a30db24-326a-4f24-8ea0-e3d1367a2b76-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305174 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305259 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305367 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.305401 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 
00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.306593 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.309442 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.319735 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.339817 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"route-controller-manager-54dd5cd958-f2lqt\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") " pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.446134 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.846183 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" event={"ID":"5a30db24-326a-4f24-8ea0-e3d1367a2b76","Type":"ContainerDied","Data":"750a85be2f0f629cc184ac0a4c018b832bba1ef5898acd3b3254238edafdcee9"} Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.846231 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.846243 4983 scope.go:117] "RemoveContainer" containerID="6c0b025bd820fc741cfadf00ec9f111d65642da96c73116d19a68bdde2913908" Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.863844 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:50 crc kubenswrapper[4983]: I0316 00:10:50.867019 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-74d65d8956-b8lr7"] Mar 16 00:10:52 crc kubenswrapper[4983]: I0316 00:10:52.098303 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a30db24-326a-4f24-8ea0-e3d1367a2b76" path="/var/lib/kubelet/pods/5a30db24-326a-4f24-8ea0-e3d1367a2b76/volumes" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.448247 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.448534 4983 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.448581 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.449128 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:10:53 crc kubenswrapper[4983]: I0316 00:10:53.449177 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383" gracePeriod=600 Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.068278 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.068458 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kmd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-b68d7_openshift-marketplace(cbebf69d-773f-4829-a4ec-e443d52ef275): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.069635 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-b68d7" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" Mar 16 00:10:55 crc 
kubenswrapper[4983]: E0316 00:10:55.083112 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.083235 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r8x8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-kjc2w_openshift-marketplace(00a4a2a2-9263-4b76-8294-fa9c4d918fc7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:55 crc kubenswrapper[4983]: E0316 00:10:55.084366 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kjc2w" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.110034 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kjc2w" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.156369 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.175948 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.176159 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bqln2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil
,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-txzqn_openshift-marketplace(ca55ad69-3f41-4d0c-8f86-83a583ff6fe4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.178345 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-txzqn" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.189773 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:10:58 crc kubenswrapper[4983]: E0316 00:10:58.190079 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.190103 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.190228 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.190726 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.199658 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.205840 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206168 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206200 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206261 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " 
pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.206444 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308341 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308435 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308488 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") pod \"99a90707-df7a-4c5f-9502-47f5eaafa320\" (UID: \"99a90707-df7a-4c5f-9502-47f5eaafa320\") " Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308649 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308688 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308706 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.308731 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: 
I0316 00:10:58.308788 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.311723 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca" (OuterVolumeSpecName: "client-ca") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.311793 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config" (OuterVolumeSpecName: "config") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312380 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312666 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312838 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.312933 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.315411 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.315885 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.316125 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.317877 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5" (OuterVolumeSpecName: "kube-api-access-mbtv5") pod "99a90707-df7a-4c5f-9502-47f5eaafa320" (UID: "99a90707-df7a-4c5f-9502-47f5eaafa320"). InnerVolumeSpecName "kube-api-access-mbtv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.329195 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"controller-manager-678b5dcc8b-f4ncq\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.410669 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99a90707-df7a-4c5f-9502-47f5eaafa320-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411282 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411331 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411352 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99a90707-df7a-4c5f-9502-47f5eaafa320-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.411368 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbtv5\" (UniqueName: \"kubernetes.io/projected/99a90707-df7a-4c5f-9502-47f5eaafa320-kube-api-access-mbtv5\") on node \"crc\" DevicePath \"\"" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.509901 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.558082 4983 patch_prober.go:28] interesting pod/controller-manager-76ff476bcc-pgmwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.558173 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.893423 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" event={"ID":"99a90707-df7a-4c5f-9502-47f5eaafa320","Type":"ContainerDied","Data":"75bf14131d5b8d3db0d67d7f812d7d6f097de077cb0e31a121d0d18e80488d4e"} Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.893473 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76ff476bcc-pgmwb" Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.896011 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383" exitCode=0 Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.896134 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383"} Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.937223 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:10:58 crc kubenswrapper[4983]: I0316 00:10:58.939241 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-76ff476bcc-pgmwb"] Mar 16 00:11:00 crc kubenswrapper[4983]: I0316 00:11:00.104535 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a90707-df7a-4c5f-9502-47f5eaafa320" path="/var/lib/kubelet/pods/99a90707-df7a-4c5f-9502-47f5eaafa320/volumes" Mar 16 00:11:00 crc kubenswrapper[4983]: E0316 00:11:00.622974 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:00 crc kubenswrapper[4983]: E0316 00:11:00.623548 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x4gpf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sv5g7_openshift-marketplace(b6bd9bf5-fa59-4fef-9589-7b5865098bd2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:00 crc kubenswrapper[4983]: E0316 00:11:00.625116 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sv5g7" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" Mar 16 00:11:01 crc 
kubenswrapper[4983]: E0316 00:11:01.059078 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 16 00:11:01 crc kubenswrapper[4983]: E0316 00:11:01.059247 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm28s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-hsgsl_openshift-marketplace(8fd3d4ca-4839-4327-8121-fe6ba21051da): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:01 crc kubenswrapper[4983]: E0316 00:11:01.060437 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" Mar 16 00:11:04 crc kubenswrapper[4983]: E0316 00:11:04.973064 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sv5g7" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" Mar 16 00:11:04 crc kubenswrapper[4983]: E0316 00:11:04.975254 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-txzqn" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" Mar 16 00:11:04 crc kubenswrapper[4983]: E0316 00:11:04.975446 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" Mar 16 00:11:04 crc kubenswrapper[4983]: W0316 00:11:04.979926 4983 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a099f86_8967_4361_bbbf_4dfa8385d2f2.slice/crio-e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f WatchSource:0}: Error finding container e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f: Status 404 returned error can't find the container with id e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.164438 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.164902 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8nr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-7qx9g_openshift-marketplace(7bc03354-3cba-40ac-a894-844d6ae1ee69): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.166119 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" Mar 16 00:11:05 crc 
kubenswrapper[4983]: I0316 00:11:05.272883 4983 scope.go:117] "RemoveContainer" containerID="3c6480b89b144d534c7c70c8010083a74e435b419d50a08c3805a800806b67d8" Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.560895 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.685286 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"] Mar 16 00:11:05 crc kubenswrapper[4983]: W0316 00:11:05.703241 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4abd0e95_c153_4402_b4c0_447e8df8ef5e.slice/crio-ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b WatchSource:0}: Error finding container ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b: Status 404 returned error can't find the container with id ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.865621 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.921805 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.948409 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b93405e1-68b8-43ab-9628-cfd937aeca3f","Type":"ContainerStarted","Data":"ca9f3bcfd67825a8be572eb5e49d99ffdc8f464436504835a72dd955b5d125ac"} Mar 16 00:11:05 crc kubenswrapper[4983]: W0316 00:11:05.964419 4983 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9820571e_90e5_4a57_925f_6dee047d6c9d.slice/crio-0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef WatchSource:0}: Error finding container 0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef: Status 404 returned error can't find the container with id 0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.980375 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerStarted","Data":"bb9e1da16a29be893a6a9d10f13e6b9a3bf25b7bf35da6d8f078d76e4ab8219e"} Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.988037 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerStarted","Data":"e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f"} Mar 16 00:11:05 crc kubenswrapper[4983]: I0316 00:11:05.989485 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerStarted","Data":"ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b"} Mar 16 00:11:05 crc kubenswrapper[4983]: E0316 00:11:05.997298 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" Mar 16 00:11:06 crc kubenswrapper[4983]: I0316 00:11:06.627203 4983 csr.go:261] certificate signing request csr-fkfcq is approved, waiting to be issued Mar 16 00:11:06 crc kubenswrapper[4983]: I0316 
00:11:06.633443 4983 csr.go:257] certificate signing request csr-fkfcq is issued Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.000152 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerStarted","Data":"9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.001309 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerStarted","Data":"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.001434 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.002522 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerStarted","Data":"76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.004329 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerStarted","Data":"7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.004555 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.005930 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerStarted","Data":"29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.005957 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerStarted","Data":"0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.006792 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.008292 4983 generic.go:334] "Generic (PLEG): container finished" podID="9da42bf3-da76-4db7-9653-f2f08567084f" containerID="f1d9cd29662f3f229511dac637df41ff7b782921910c342dbfa3015d6466b383" exitCode=0 Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.008324 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560328-sngnj" event={"ID":"9da42bf3-da76-4db7-9653-f2f08567084f","Type":"ContainerDied","Data":"f1d9cd29662f3f229511dac637df41ff7b782921910c342dbfa3015d6466b383"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.011211 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.011260 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.011808 4983 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.012783 4983 generic.go:334] "Generic (PLEG): container finished" podID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerID="71d1cd633bfe3af34262442e473b5136134787de19b07e78235e338e5e0f0440" exitCode=0 Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.012817 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b93405e1-68b8-43ab-9628-cfd937aeca3f","Type":"ContainerDied","Data":"71d1cd633bfe3af34262442e473b5136134787de19b07e78235e338e5e0f0440"} Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.020675 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=22.020658956 podStartE2EDuration="22.020658956s" podCreationTimestamp="2026-03-16 00:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.019178571 +0000 UTC m=+275.619277001" watchObservedRunningTime="2026-03-16 00:11:07.020658956 +0000 UTC m=+275.620757386" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.041055 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" podStartSLOduration=21.041033134 podStartE2EDuration="21.041033134s" podCreationTimestamp="2026-03-16 00:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.038051225 +0000 UTC m=+275.638149655" watchObservedRunningTime="2026-03-16 00:11:07.041033134 +0000 UTC m=+275.641131574" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.161662 4983 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" podStartSLOduration=228.161645264 podStartE2EDuration="3m48.161645264s" podCreationTimestamp="2026-03-16 00:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.159539551 +0000 UTC m=+275.759637991" watchObservedRunningTime="2026-03-16 00:11:07.161645264 +0000 UTC m=+275.761743694" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.192539 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" podStartSLOduration=22.192523385 podStartE2EDuration="22.192523385s" podCreationTimestamp="2026-03-16 00:10:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:11:07.191095963 +0000 UTC m=+275.791194393" watchObservedRunningTime="2026-03-16 00:11:07.192523385 +0000 UTC m=+275.792621815" Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.635375 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-29 12:54:03.680179002 +0000 UTC Mar 16 00:11:07 crc kubenswrapper[4983]: I0316 00:11:07.635637 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6924h42m56.044544061s for next certificate rotation Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.021346 4983 generic.go:334] "Generic (PLEG): container finished" podID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerID="210bd7f5ab48e451b18cd186b0e612a0157714bee428a4d39d25cdd92c0f3eb0" exitCode=0 Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.021391 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" 
event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"210bd7f5ab48e451b18cd186b0e612a0157714bee428a4d39d25cdd92c0f3eb0"} Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.023357 4983 generic.go:334] "Generic (PLEG): container finished" podID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerID="76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c" exitCode=0 Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.023445 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerDied","Data":"76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c"} Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.026451 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerStarted","Data":"4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf"} Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.341867 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5" Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.478589 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.489521 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj"
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.514360 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") pod \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\" (UID: \"c39b8480-5521-4ff7-b6ec-4f67009b1f5c\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.520007 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b" (OuterVolumeSpecName: "kube-api-access-wtp9b") pod "c39b8480-5521-4ff7-b6ec-4f67009b1f5c" (UID: "c39b8480-5521-4ff7-b6ec-4f67009b1f5c"). InnerVolumeSpecName "kube-api-access-wtp9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616053 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") pod \"9da42bf3-da76-4db7-9653-f2f08567084f\" (UID: \"9da42bf3-da76-4db7-9653-f2f08567084f\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616129 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") pod \"b93405e1-68b8-43ab-9628-cfd937aeca3f\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616215 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") pod \"b93405e1-68b8-43ab-9628-cfd937aeca3f\" (UID: \"b93405e1-68b8-43ab-9628-cfd937aeca3f\") "
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616293 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b93405e1-68b8-43ab-9628-cfd937aeca3f" (UID: "b93405e1-68b8-43ab-9628-cfd937aeca3f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616804 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtp9b\" (UniqueName: \"kubernetes.io/projected/c39b8480-5521-4ff7-b6ec-4f67009b1f5c-kube-api-access-wtp9b\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.616840 4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b93405e1-68b8-43ab-9628-cfd937aeca3f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.619412 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b93405e1-68b8-43ab-9628-cfd937aeca3f" (UID: "b93405e1-68b8-43ab-9628-cfd937aeca3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.619679 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq" (OuterVolumeSpecName: "kube-api-access-gs7vq") pod "9da42bf3-da76-4db7-9653-f2f08567084f" (UID: "9da42bf3-da76-4db7-9653-f2f08567084f"). InnerVolumeSpecName "kube-api-access-gs7vq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.635998 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 01:32:35.360868712 +0000 UTC
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.636026 4983 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6289h21m26.724844872s for next certificate rotation
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.718409 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs7vq\" (UniqueName: \"kubernetes.io/projected/9da42bf3-da76-4db7-9653-f2f08567084f-kube-api-access-gs7vq\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:08 crc kubenswrapper[4983]: I0316 00:11:08.718459 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b93405e1-68b8-43ab-9628-cfd937aeca3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.032950 4983 generic.go:334] "Generic (PLEG): container finished" podID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerID="4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf" exitCode=0
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.033025 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.037348 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560328-sngnj" event={"ID":"9da42bf3-da76-4db7-9653-f2f08567084f","Type":"ContainerDied","Data":"fde617a4855b193426c3b4102e81b29ab0d3e6c44d90e708f2f6bda3bb35ebf8"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.037400 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fde617a4855b193426c3b4102e81b29ab0d3e6c44d90e708f2f6bda3bb35ebf8"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.037361 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560328-sngnj"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.043818 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b93405e1-68b8-43ab-9628-cfd937aeca3f","Type":"ContainerDied","Data":"ca9f3bcfd67825a8be572eb5e49d99ffdc8f464436504835a72dd955b5d125ac"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.043855 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca9f3bcfd67825a8be572eb5e49d99ffdc8f464436504835a72dd955b5d125ac"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.043871 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.045557 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerStarted","Data":"6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.048317 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560330-65dr5"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.048452 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560330-65dr5" event={"ID":"c39b8480-5521-4ff7-b6ec-4f67009b1f5c","Type":"ContainerDied","Data":"7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980"}
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.048505 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7852b21a7717c8a01be82d6d5cde8dd356e30ed41d3089cd4c321389eb11b980"
Mar 16 00:11:09 crc kubenswrapper[4983]: I0316 00:11:09.088198 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vxnxc" podStartSLOduration=2.681238605 podStartE2EDuration="1m4.088178685s" podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="2026-03-16 00:10:07.303089218 +0000 UTC m=+215.903187638" lastFinishedPulling="2026-03-16 00:11:08.710029288 +0000 UTC m=+277.310127718" observedRunningTime="2026-03-16 00:11:09.083745213 +0000 UTC m=+277.683843643" watchObservedRunningTime="2026-03-16 00:11:09.088178685 +0000 UTC m=+277.688277115"
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:12.072912 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerStarted","Data":"ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f"}
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:12.091028 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56c2t" podStartSLOduration=3.839430655 podStartE2EDuration="1m4.09100568s" podCreationTimestamp="2026-03-16 00:10:08 +0000 UTC" firstStartedPulling="2026-03-16 00:10:10.53965046 +0000 UTC m=+219.139748890" lastFinishedPulling="2026-03-16 00:11:10.791225485 +0000 UTC m=+279.391323915" observedRunningTime="2026-03-16 00:11:12.090281348 +0000 UTC m=+280.690379778" watchObservedRunningTime="2026-03-16 00:11:12.09100568 +0000 UTC m=+280.691104110"
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:13.081177 4983 generic.go:334] "Generic (PLEG): container finished" podID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerID="b6acaa7dffa774e191a9bf342869bf819b4d039ee2bd145b14e03704f80e4abc" exitCode=0
Mar 16 00:11:13 crc kubenswrapper[4983]: I0316 00:11:13.081304 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"b6acaa7dffa774e191a9bf342869bf819b4d039ee2bd145b14e03704f80e4abc"}
Mar 16 00:11:15 crc kubenswrapper[4983]: I0316 00:11:15.932686 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:15 crc kubenswrapper[4983]: I0316 00:11:15.932968 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:16 crc kubenswrapper[4983]: I0316 00:11:16.683614 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:16 crc kubenswrapper[4983]: I0316 00:11:16.783264 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vxnxc"
Mar 16 00:11:18 crc kubenswrapper[4983]: I0316 00:11:18.987953 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:18 crc kubenswrapper[4983]: I0316 00:11:18.988331 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:19 crc kubenswrapper[4983]: I0316 00:11:19.030739 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:19 crc kubenswrapper[4983]: I0316 00:11:19.162211 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56c2t"
Mar 16 00:11:20 crc kubenswrapper[4983]: I0316 00:11:20.119950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerStarted","Data":"c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d"}
Mar 16 00:11:20 crc kubenswrapper[4983]: I0316 00:11:20.140840 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b68d7" podStartSLOduration=4.222317305 podStartE2EDuration="1m13.140822982s" podCreationTimestamp="2026-03-16 00:10:07 +0000 UTC" firstStartedPulling="2026-03-16 00:10:10.54636499 +0000 UTC m=+219.146463420" lastFinishedPulling="2026-03-16 00:11:19.464870667 +0000 UTC m=+288.064969097" observedRunningTime="2026-03-16 00:11:20.138370329 +0000 UTC m=+288.738468759" watchObservedRunningTime="2026-03-16 00:11:20.140822982 +0000 UTC m=+288.740921412"
Mar 16 00:11:21 crc kubenswrapper[4983]: I0316 00:11:21.126444 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"}
Mar 16 00:11:21 crc kubenswrapper[4983]: I0316 00:11:21.128034 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerStarted","Data":"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"}
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.143565 4983 generic.go:334] "Generic (PLEG): container finished" podID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf" exitCode=0
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.143769 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"}
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.146105 4983 generic.go:334] "Generic (PLEG): container finished" podID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404" exitCode=0
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.146152 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"}
Mar 16 00:11:22 crc kubenswrapper[4983]: I0316 00:11:22.546239 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x"
Mar 16 00:11:25 crc kubenswrapper[4983]: I0316 00:11:25.904904 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"]
Mar 16 00:11:25 crc kubenswrapper[4983]: I0316 00:11:25.905631 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager" containerID="cri-o://29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a" gracePeriod=30
Mar 16 00:11:25 crc kubenswrapper[4983]: I0316 00:11:25.999873 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"]
Mar 16 00:11:26 crc kubenswrapper[4983]: I0316 00:11:26.000146 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager" containerID="cri-o://7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a" gracePeriod=30
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.176108 4983 generic.go:334] "Generic (PLEG): container finished" podID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerID="7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a" exitCode=0
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.176280 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerDied","Data":"7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a"}
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.178370 4983 generic.go:334] "Generic (PLEG): container finished" podID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerID="29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a" exitCode=0
Mar 16 00:11:27 crc kubenswrapper[4983]: I0316 00:11:27.178395 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerDied","Data":"29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a"}
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.018176 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.018238 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.051894 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.223357 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b68d7"
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.510821 4983 patch_prober.go:28] interesting pod/controller-manager-678b5dcc8b-f4ncq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused" start-of-body=
Mar 16 00:11:28 crc kubenswrapper[4983]: I0316 00:11:28.510874 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.63:8443/healthz\": dial tcp 10.217.0.63:8443: connect: connection refused"
Mar 16 00:11:30 crc kubenswrapper[4983]: I0316 00:11:30.447442 4983 patch_prober.go:28] interesting pod/route-controller-manager-54dd5cd958-f2lqt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body=
Mar 16 00:11:30 crc kubenswrapper[4983]: I0316 00:11:30.447507 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.883850 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.917341 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.917937 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.917954 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.917999 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918008 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.918021 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerName="pruner"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918029 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerName="pruner"
Mar 16 00:11:33 crc kubenswrapper[4983]: E0316 00:11:33.918043 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918079 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918265 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918281 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93405e1-68b8-43ab-9628-cfd937aeca3f" containerName="pruner"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918290 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" containerName="oc"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.918328 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" containerName="route-controller-manager"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.919056 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:33 crc kubenswrapper[4983]: I0316 00:11:33.921117 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"]
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038499 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038892 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.038954 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") pod \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\" (UID: \"4abd0e95-c153-4402-b4c0-447e8df8ef5e\") "
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039287 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca" (OuterVolumeSpecName: "client-ca") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039450 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039562 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039604 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039631 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039726 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.039747 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config" (OuterVolumeSpecName: "config") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.044160 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.044855 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp" (OuterVolumeSpecName: "kube-api-access-fwzcp") pod "4abd0e95-c153-4402-b4c0-447e8df8ef5e" (UID: "4abd0e95-c153-4402-b4c0-447e8df8ef5e"). InnerVolumeSpecName "kube-api-access-fwzcp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141736 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141863 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.141881 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.142000 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwzcp\" (UniqueName: \"kubernetes.io/projected/4abd0e95-c153-4402-b4c0-447e8df8ef5e-kube-api-access-fwzcp\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.142018 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4abd0e95-c153-4402-b4c0-447e8df8ef5e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.142030 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4abd0e95-c153-4402-b4c0-447e8df8ef5e-config\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.143543 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.143606 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.146210 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.170601 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"route-controller-manager-78f645c6d4-kwwjv\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233094 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233119 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt" event={"ID":"4abd0e95-c153-4402-b4c0-447e8df8ef5e","Type":"ContainerDied","Data":"ce5fe3e5b0b512c532291202764fa14d31963aa82f8479c829465bb5e42e3a4b"}
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233183 4983 scope.go:117] "RemoveContainer" containerID="7e878b7ac1a94791722310700db59b2afda440870dd1e4bec364f135c03d7d6a"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.233254 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.257721 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"]
Mar 16 00:11:34 crc kubenswrapper[4983]: I0316 00:11:34.261618 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54dd5cd958-f2lqt"]
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.106350 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abd0e95-c153-4402-b4c0-447e8df8ef5e" path="/var/lib/kubelet/pods/4abd0e95-c153-4402-b4c0-447e8df8ef5e/volumes"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.245619 4983 generic.go:334] "Generic (PLEG): container finished" podID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerID="b9e245e332a00fe31e8a513f16d938a911b68f20bd84b7aa4a069280729c1f31" exitCode=0
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.245665 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerDied","Data":"b9e245e332a00fe31e8a513f16d938a911b68f20bd84b7aa4a069280729c1f31"}
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.384838 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.410477 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:11:36 crc kubenswrapper[4983]: E0316 00:11:36.410723 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.410734 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.410847 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" containerName="controller-manager"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.411187 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.428772 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"]
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.468917 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.468965 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.468992 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.469103 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " 
pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.469148 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570069 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lqdt\" (UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570128 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570148 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570229 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570253 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") pod \"9820571e-90e5-4a57-925f-6dee047d6c9d\" (UID: \"9820571e-90e5-4a57-925f-6dee047d6c9d\") " Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570375 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570400 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570423 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.570466 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 
00:11:36.570512 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571210 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571491 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571615 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config" (OuterVolumeSpecName: "config") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571719 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571801 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca" (OuterVolumeSpecName: "client-ca") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.571986 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.574362 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt" (OuterVolumeSpecName: "kube-api-access-7lqdt") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "kube-api-access-7lqdt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.574385 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9820571e-90e5-4a57-925f-6dee047d6c9d" (UID: "9820571e-90e5-4a57-925f-6dee047d6c9d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.577407 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.589604 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"controller-manager-b8585898d-qb964\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671695 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671797 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671835 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lqdt\" 
(UniqueName: \"kubernetes.io/projected/9820571e-90e5-4a57-925f-6dee047d6c9d-kube-api-access-7lqdt\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671859 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9820571e-90e5-4a57-925f-6dee047d6c9d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.671882 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9820571e-90e5-4a57-925f-6dee047d6c9d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:36 crc kubenswrapper[4983]: I0316 00:11:36.725534 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.255455 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" event={"ID":"9820571e-90e5-4a57-925f-6dee047d6c9d","Type":"ContainerDied","Data":"0f80f8d77d5e7d52546396d855090da3e1caada3b0ec5bf14e1494148b61e7ef"} Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.256992 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq" Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.320293 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.324992 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-678b5dcc8b-f4ncq"] Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.760596 4983 scope.go:117] "RemoveContainer" containerID="29ae933b927660608a2257d29f2db898ec60860e11279ee76b7c688737efe62a" Mar 16 00:11:37 crc kubenswrapper[4983]: I0316 00:11:37.998980 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.101409 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9820571e-90e5-4a57-925f-6dee047d6c9d" path="/var/lib/kubelet/pods/9820571e-90e5-4a57-925f-6dee047d6c9d/volumes" Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.194199 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") pod \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.194739 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") pod \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\" (UID: \"9f5bd50b-b197-4deb-ac50-768e3baa6cff\") " Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.196200 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca" (OuterVolumeSpecName: "serviceca") pod "9f5bd50b-b197-4deb-ac50-768e3baa6cff" (UID: "9f5bd50b-b197-4deb-ac50-768e3baa6cff"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.205533 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6" (OuterVolumeSpecName: "kube-api-access-w74t6") pod "9f5bd50b-b197-4deb-ac50-768e3baa6cff" (UID: "9f5bd50b-b197-4deb-ac50-768e3baa6cff"). InnerVolumeSpecName "kube-api-access-w74t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.266627 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29560320-9tclx" event={"ID":"9f5bd50b-b197-4deb-ac50-768e3baa6cff","Type":"ContainerDied","Data":"de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce"} Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.266692 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de3be632dc1110954b83a945d8663b11e01d0d75623e9f6802f42d930bdec5ce" Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.266647 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29560320-9tclx" Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.295997 4983 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9f5bd50b-b197-4deb-ac50-768e3baa6cff-serviceca\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:38 crc kubenswrapper[4983]: I0316 00:11:38.296040 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w74t6\" (UniqueName: \"kubernetes.io/projected/9f5bd50b-b197-4deb-ac50-768e3baa6cff-kube-api-access-w74t6\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:42 crc kubenswrapper[4983]: I0316 00:11:42.591168 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"] Mar 16 00:11:42 crc kubenswrapper[4983]: W0316 00:11:42.604501 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9cb240d_7329_47d5_89bd_d03b287f52c8.slice/crio-41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b WatchSource:0}: Error finding container 41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b: Status 404 returned error can't find the container with id 41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b Mar 16 00:11:42 crc kubenswrapper[4983]: I0316 00:11:42.659926 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"] Mar 16 00:11:42 crc kubenswrapper[4983]: W0316 00:11:42.661985 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4adba1_ea9b_4255_9ae7_c311268a26f2.slice/crio-b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef WatchSource:0}: Error finding container b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef: Status 404 returned error 
can't find the container with id b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.306520 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerStarted","Data":"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.306560 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerStarted","Data":"41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.308588 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerStarted","Data":"5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.310077 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerStarted","Data":"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.310106 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerStarted","Data":"b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.312128 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" 
event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerStarted","Data":"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.314945 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerStarted","Data":"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.316514 4983 generic.go:334] "Generic (PLEG): container finished" podID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerID="ff00e7152e69c4aeaaff4ebd02f8e9bc3011a8b0e33817b723307cd7fa5fe455" exitCode=0 Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.316575 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"ff00e7152e69c4aeaaff4ebd02f8e9bc3011a8b0e33817b723307cd7fa5fe455"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.318145 4983 generic.go:334] "Generic (PLEG): container finished" podID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerID="de0cee5fa65ae8acc06500ed4f7bfd1b7fc45fe51327cba7b49afb9439e0134f" exitCode=0 Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.318252 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"de0cee5fa65ae8acc06500ed4f7bfd1b7fc45fe51327cba7b49afb9439e0134f"} Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.358778 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-txzqn" podStartSLOduration=7.107563598 podStartE2EDuration="1m38.358740815s" podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="2026-03-16 00:10:08.469938906 +0000 
UTC m=+217.070037336" lastFinishedPulling="2026-03-16 00:11:39.721116103 +0000 UTC m=+308.321214553" observedRunningTime="2026-03-16 00:11:43.356244611 +0000 UTC m=+311.956343051" watchObservedRunningTime="2026-03-16 00:11:43.358740815 +0000 UTC m=+311.958839235" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.942838 4983 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.943126 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerName="image-pruner" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943141 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerName="image-pruner" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943272 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5bd50b-b197-4deb-ac50-768e3baa6cff" containerName="image-pruner" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943685 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.943949 4983 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944273 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc" gracePeriod=15 Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944307 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252" gracePeriod=15 Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944326 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9" gracePeriod=15 Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944339 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c" gracePeriod=15 Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.944343 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" containerID="cri-o://dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4" gracePeriod=15 Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.945953 4983 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946224 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946301 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946331 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946343 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946358 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946371 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946385 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946397 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 16 00:11:43 
crc kubenswrapper[4983]: E0316 00:11:43.946436 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946447 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946464 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946475 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946495 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946504 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.946521 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946529 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946653 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946733 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946862 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946875 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946885 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946895 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.946908 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.947142 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947172 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: E0316 00:11:43.947185 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947193 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947310 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.947330 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961635 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961696 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961781 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961826 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961855 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961940 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961967 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.961988 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:43 crc kubenswrapper[4983]: I0316 00:11:43.984262 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067354 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067399 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067428 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067475 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067498 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067529 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067550 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067576 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067616 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067621 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067645 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067660 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067683 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.067688 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: E0316 00:11:44.234614 4983 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-podfd0d91b2_07e6_4d69_ba2d_a1abde0ff1ef.slice/crio-9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417.scope\": RecentStats: unable to find data in memory cache]"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.267737 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 16 00:11:44 crc kubenswrapper[4983]: W0316 00:11:44.310728 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0 WatchSource:0}: Error finding container 1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0: Status 404 returned error can't find the container with id 1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.323671 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1d34355322e05124292f9760bffc220151993c2d988dff3475114f2956c7f7b0"}
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.325696 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.326859 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327418 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4" exitCode=0
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327452 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9" exitCode=0
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327465 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252" exitCode=0
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327474 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c" exitCode=2
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.327490 4983 scope.go:117] "RemoveContainer" containerID="20456684150b50dae1e804bc6233c0dcdbcef917450be7b1c8f4e8f24de60abd"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.329190 4983 generic.go:334] "Generic (PLEG): container finished" podID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerID="5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883" exitCode=0
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.329214 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883"}
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.330243 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.330492 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.330525 4983 generic.go:334] "Generic (PLEG): container finished" podID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerID="9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417" exitCode=0
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331163 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerDied","Data":"9fde2814949dd21f55871ee57d9c0de0a132a8749d55cb58695f078937d0a417"}
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331318 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331581 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331624 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331705 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.331863 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.332260 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.332843 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.333110 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.333381 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.333922 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.334222 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.334476 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.338510 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b8585898d-qb964"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.338801 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.339041 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.339389 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.339678 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.340043 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.340783 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.341057 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.341412 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.341688 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.343303 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.343561 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.343861 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: I0316 00:11:44.344119 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:44 crc kubenswrapper[4983]: E0316 00:11:44.456313 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29e7f39b8734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,LastTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.338058 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerStarted","Data":"41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00"}
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.339378 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6"}
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.340402 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.340809 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341092 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341320 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341584 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341699 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerStarted","Data":"4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72"}
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.341873 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.342345 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.342528 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.342794 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343119 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343402 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343671 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.343950 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerStarted","Data":"ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0"}
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.344077 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.344534 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.344853 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.345095 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.345376 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.345638 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.345938 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:45 crc kubenswrapper[4983]: 
I0316 00:11:45.346222 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.346403 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.346547 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.626674 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.627431 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.627897 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.628360 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.628662 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.628928 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.629266 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.629521 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.629802 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.642416 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.642921 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: 
E0316 00:11:45.643132 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.643292 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.643443 4983 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.643468 4983 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.643602 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="200ms" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687526 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") pod \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687609 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") pod 
\"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687633 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") pod \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\" (UID: \"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef\") " Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687832 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock" (OuterVolumeSpecName: "var-lock") pod "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" (UID: "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.687870 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" (UID: "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.695970 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" (UID: "fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.708460 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29e7f39b8734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,LastTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.789020 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.789053 4983 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:45 crc kubenswrapper[4983]: I0316 00:11:45.789061 4983 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef-var-lock\") on node \"crc\" DevicePath \"\"" Mar 16 00:11:45 crc kubenswrapper[4983]: E0316 00:11:45.844990 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="400ms" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.006132 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.006175 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.232878 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.233172 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:46 crc kubenswrapper[4983]: E0316 00:11:46.246545 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="800ms" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.274374 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.274975 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.275470 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.275936 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.276208 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.276463 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.276774 4983 status_manager.go:851] "Failed to get status for pod" 
podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.277093 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.277368 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.277659 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.354092 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.354731 4983 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc" exitCode=0 Mar 16 00:11:46 crc kubenswrapper[4983]: 
I0316 00:11:46.355989 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.355999 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef","Type":"ContainerDied","Data":"bb9e1da16a29be893a6a9d10f13e6b9a3bf25b7bf35da6d8f078d76e4ab8219e"} Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.356048 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb9e1da16a29be893a6a9d10f13e6b9a3bf25b7bf35da6d8f078d76e4ab8219e" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.357722 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358058 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358246 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358442 4983 
status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358628 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.358831 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.359019 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.359383 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.359574 
4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399071 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399267 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399433 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399600 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: 
I0316 00:11:46.399797 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.399978 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.400160 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.400335 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.400517 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 
38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.758972 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.759072 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.803702 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.804267 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.804428 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.804696 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805163 4983 status_manager.go:851] "Failed to get status for pod" 
podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805407 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805556 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805698 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.805855 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.806193 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.851743 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.852814 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.853633 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.853937 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854204 4983 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854440 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854656 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.854898 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855116 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855313 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855597 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:46 crc kubenswrapper[4983]: I0316 00:11:46.855932 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020071 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020236 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020307 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020397 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020441 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020469 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020519 4983 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020535 4983 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.020547 4983 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.047587 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="1.6s"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.086686 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" probeResult="failure" output=<
Mar 16 00:11:47 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s
Mar 16 00:11:47 crc kubenswrapper[4983]: >
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.366034 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.366898 4983 scope.go:117] "RemoveContainer" containerID="dbeb163c90f1a627c2aaaff92a3faa35f8ebdf9e11c0c74ada59941276e60db4"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.367001 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.381008 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.382044 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.382715 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.383011 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.383348 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.383742 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.384085 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.385234 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.386140 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.386192 4983 scope.go:117] "RemoveContainer" containerID="a1d57b85a93eca15ac10ae10bcbd0b1911c3a385246d90310de18025d276ffc9"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.386416 4983 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.400172 4983 scope.go:117] "RemoveContainer" containerID="53939ebc5461c25cbc45f253f96e1082114837ea571ef6d4e3c65c394b23e252"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.414803 4983 scope.go:117] "RemoveContainer" containerID="64d50f3cd021279caf07f4a01a57d99f44c452fb4e5cb80037ad8d7086941c6c"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.429993 4983 scope.go:117] "RemoveContainer" containerID="094e4efea6c7c91c6b15cb690c53c25b1c7194e604131a39796b193b0006c6bc"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.433464 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[],\\\"sizeBytes\\\":1250173141},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3008c2e1161979da3569238dfcb92458c7bf2cfa54386b63c466812c99ff2497\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d71bca7600fcb53c6999817cf27c91d0a308793cafa2c95f1cae2bb7bee6f57\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221752025},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.433909 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434120 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434312 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434502 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:47 crc kubenswrapper[4983]: E0316 00:11:47.434526 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 16 00:11:47 crc kubenswrapper[4983]: I0316 00:11:47.454591 4983 scope.go:117] "RemoveContainer" containerID="3159b49866bd2f3cda27d20796d2ff38ce387ef27902b80f90f699c63182f719"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.105268 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.389423 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.389488 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.450990 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.451733 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.452527 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.453483 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.454133 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.454611 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.455159 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.455962 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.456857 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: I0316 00:11:48.457448 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:48 crc kubenswrapper[4983]: E0316 00:11:48.648862 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="3.2s"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.400566 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.401099 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.427741 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.428440 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.428931 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.429580 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.429874 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.430104 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.430389 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.430743 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.431086 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:49 crc kubenswrapper[4983]: I0316 00:11:49.431353 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:50 crc kubenswrapper[4983]: I0316 00:11:50.450511 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" probeResult="failure" output=<
Mar 16 00:11:50 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s
Mar 16 00:11:50 crc kubenswrapper[4983]: >
Mar 16 00:11:51 crc kubenswrapper[4983]: E0316 00:11:51.850230 4983 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" interval="6.4s"
Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.094519 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.095178 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.095625 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.095974 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.096353 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused"
Mar 16 00:11:52 crc
kubenswrapper[4983]: I0316 00:11:52.096717 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.097161 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.097384 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:52 crc kubenswrapper[4983]: I0316 00:11:52.097686 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:55 crc kubenswrapper[4983]: E0316 00:11:55.709705 4983 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.223:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189d29e7f39b8734 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,LastTimestamp:2026-03-16 00:11:44.455083828 +0000 UTC m=+313.055182258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.042867 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.043799 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.044078 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.044317 4983 status_manager.go:851] 
"Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.044670 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.045263 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.045505 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.045826 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc 
kubenswrapper[4983]: I0316 00:11:56.046131 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.046350 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.079430 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.080122 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.080534 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.080945 4983 status_manager.go:851] "Failed to get status for pod" 
podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.081213 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.081514 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.082081 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.082479 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.082873 4983 status_manager.go:851] "Failed to get status for pod" 
podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.083047 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.092141 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.092856 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093130 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093304 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093455 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093614 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093782 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.093940 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.094093 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.094246 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.117100 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.117138 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:56 crc kubenswrapper[4983]: E0316 00:11:56.117494 4983 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.117978 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.281779 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.282921 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283068 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283209 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283341 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283486 4983 status_manager.go:851] "Failed to get 
status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283623 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283769 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.283926 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.284058 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 
00:11:56.419201 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ac719b6d5974cdb29679bd18f99beb6f7ded826cf7f9bf7bcf8117d7fb98dc1b"} Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.796054 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.796566 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.796977 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.797235 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.797542 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial 
tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.797785 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798019 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798264 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798514 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:56 crc kubenswrapper[4983]: I0316 00:11:56.798767 4983 status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427258 4983 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3862dea5b181d58336a593f482a4ce66f4ee8e743ace00ca62e0c6bde1865a68" exitCode=0 Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427301 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3862dea5b181d58336a593f482a4ce66f4ee8e743ace00ca62e0c6bde1865a68"} Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427591 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.427822 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.428291 4983 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.428402 4983 status_manager.go:851] "Failed to get status for pod" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" pod="openshift-marketplace/certified-operators-sv5g7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-sv5g7\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.428999 4983 
status_manager.go:851] "Failed to get status for pod" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78f645c6d4-kwwjv\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.429298 4983 status_manager.go:851] "Failed to get status for pod" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.429628 4983 status_manager.go:851] "Failed to get status for pod" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" pod="openshift-marketplace/redhat-marketplace-kjc2w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-kjc2w\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.429970 4983 status_manager.go:851] "Failed to get status for pod" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" pod="openshift-marketplace/redhat-operators-7qx9g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-7qx9g\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.430394 4983 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc 
kubenswrapper[4983]: I0316 00:11:57.430655 4983 status_manager.go:851] "Failed to get status for pod" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" pod="openshift-marketplace/certified-operators-hsgsl" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-hsgsl\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.431067 4983 status_manager.go:851] "Failed to get status for pod" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" pod="openshift-marketplace/community-operators-txzqn" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-txzqn\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: I0316 00:11:57.431999 4983 status_manager.go:851] "Failed to get status for pod" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-b8585898d-qb964\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.477200 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-16T00:11:57Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[],\\\"sizeBytes\\\":1250173141},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io
/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:3008c2e1161979da3569238dfcb92458c7bf2cfa54386b63c466812c99ff2497\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:3d71bca7600fcb53c6999817cf27c91d0a308793cafa2c95f1cae2bb7bee6f57\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221752025},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f
59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\
\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.477856 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.478276 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.478633 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.479195 4983 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.223:6443: connect: connection refused" Mar 16 00:11:57 crc kubenswrapper[4983]: E0316 00:11:57.479237 4983 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.122485 4983 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.122553 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.438935 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.441171 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.441251 4983 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a" exitCode=1 Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.441340 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a"} Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.442156 4983 scope.go:117] "RemoveContainer" containerID="840aca2c27637244d187a66adf8d828641ee5c28d2b356c3f3665eb5f54cce9a" Mar 16 00:11:58 crc kubenswrapper[4983]: I0316 00:11:58.444651 4983 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ead20f7bc8379f8c39f8f7828f1345adb9aaee42fa3767a1c38ff4c1c773f1fb"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.437382 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.453374 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.453974 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.454125 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3821a4b13ed2ef6f61a0d42fd64d93cd5287b1aa8d8d74d76819753dc4a0c27e"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458394 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22c13c0d9a35f0019afb4fe3c13d6f547188e5cb08665c79a453887c9d732baf"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458441 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5a31a9d6ffebdf1c7824d2abc6cfa47c386acdef3bc78af32325e775357ac172"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458455 4983 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d98c834b2fc70b5f7985a629e8e583a7cd31e26523ec30d6130d9808bb0ea915"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458468 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a73adcf069e28dc6fb6a72e0659819c919b86830014a3543d4d382310d6b3334"} Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458800 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458910 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.458983 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:11:59 crc kubenswrapper[4983]: I0316 00:11:59.475459 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qx9g" Mar 16 00:12:01 crc kubenswrapper[4983]: I0316 00:12:01.119569 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:01 crc kubenswrapper[4983]: I0316 00:12:01.119869 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:01 crc kubenswrapper[4983]: I0316 00:12:01.126077 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:03 crc kubenswrapper[4983]: I0316 00:12:03.707947 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:03 crc kubenswrapper[4983]: I0316 00:12:03.713648 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:04 crc kubenswrapper[4983]: I0316 00:12:04.485521 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:04 crc kubenswrapper[4983]: I0316 00:12:04.757658 4983 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:04 crc kubenswrapper[4983]: I0316 00:12:04.989387 4983 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="85848ab8-d516-4d81-82d6-469eb041be6d" Mar 16 00:12:05 crc kubenswrapper[4983]: I0316 00:12:05.498476 4983 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:12:05 crc kubenswrapper[4983]: I0316 00:12:05.498830 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="a38880dd-7514-407a-af55-eff24eca32c5" Mar 16 00:12:05 crc kubenswrapper[4983]: I0316 00:12:05.501247 4983 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="85848ab8-d516-4d81-82d6-469eb041be6d" Mar 16 00:12:14 crc kubenswrapper[4983]: I0316 00:12:14.533744 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 16 00:12:15 crc kubenswrapper[4983]: I0316 00:12:15.700791 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 16 00:12:15 crc kubenswrapper[4983]: I0316 00:12:15.915655 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.024997 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.073939 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.141805 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.144915 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.251786 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.318108 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.457217 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.457896 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.567442 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.570886 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.673015 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.753024 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 16 00:12:16 crc kubenswrapper[4983]: I0316 00:12:16.906907 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.070778 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.189545 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.198304 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.341370 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.346534 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.502858 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 
00:12:17.548162 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.638307 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.650812 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.750926 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.755549 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 16 00:12:17 crc kubenswrapper[4983]: I0316 00:12:17.987901 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.013206 4983 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.059636 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.080076 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.080775 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.193765 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.260813 4983 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.320149 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.420002 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.518765 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.567135 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.612607 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.642245 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.646672 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.648103 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.668963 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.720651 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 
00:12:18.908418 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.941068 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 16 00:12:18 crc kubenswrapper[4983]: I0316 00:12:18.957860 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.110558 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.116675 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.159509 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.191302 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.201032 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.267243 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.302653 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.324104 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 16 
00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.385136 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.557376 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.612543 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.665425 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.780498 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.800291 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.826814 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.988911 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 16 00:12:19 crc kubenswrapper[4983]: I0316 00:12:19.993991 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.004685 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.043981 4983 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.083345 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.122365 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.146825 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.162919 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.187515 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.198860 4983 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.225846 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.235341 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.247329 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.250848 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 16 00:12:20 crc 
kubenswrapper[4983]: I0316 00:12:20.251217 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.294232 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.388055 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.537887 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.548632 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.571915 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.651727 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.683098 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.759527 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.771411 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.785223 4983 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-console"/"oauth-serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.847499 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.856161 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.895961 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.932902 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 16 00:12:20 crc kubenswrapper[4983]: I0316 00:12:20.997650 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.057256 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.066211 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.163482 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.168551 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.256898 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.312628 4983 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.316453 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.336784 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.338691 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.341307 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.554561 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.574173 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.609878 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.637591 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.749853 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.757745 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 16 
00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.816040 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.817082 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 16 00:12:21 crc kubenswrapper[4983]: I0316 00:12:21.920027 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.013820 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.020370 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.107029 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.115025 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.153708 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.284190 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.298856 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.502074 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 16 00:12:22 crc 
kubenswrapper[4983]: I0316 00:12:22.545622 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.624605 4983 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.650421 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.698409 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.709962 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.714212 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.745505 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.745708 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.746378 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.776175 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.875846 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.961326 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:12:22 crc kubenswrapper[4983]: I0316 00:12:22.965586 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.131067 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.231179 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.393490 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.409951 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.411139 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.455506 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.463745 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.687400 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.707091 4983 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.738820 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.751671 4983 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.753399 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qx9g" podStartSLOduration=45.522868203 podStartE2EDuration="2m14.753382621s" podCreationTimestamp="2026-03-16 00:10:09 +0000 UTC" firstStartedPulling="2026-03-16 00:10:15.872464838 +0000 UTC m=+224.472563268" lastFinishedPulling="2026-03-16 00:11:45.102979236 +0000 UTC m=+313.703077686" observedRunningTime="2026-03-16 00:12:04.85359295 +0000 UTC m=+333.453691380" watchObservedRunningTime="2026-03-16 00:12:23.753382621 +0000 UTC m=+352.353481071" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.753690 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" podStartSLOduration=57.753684959 podStartE2EDuration="57.753684959s" podCreationTimestamp="2026-03-16 00:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.802433092 +0000 UTC m=+333.402531522" watchObservedRunningTime="2026-03-16 00:12:23.753684959 +0000 UTC m=+352.353783389" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.754555 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.754548422 podStartE2EDuration="40.754548422s" podCreationTimestamp="2026-03-16 00:11:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.869174349 +0000 UTC m=+333.469272779" watchObservedRunningTime="2026-03-16 00:12:23.754548422 +0000 UTC m=+352.354646852" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.754668 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" podStartSLOduration=58.754663285 podStartE2EDuration="58.754663285s" podCreationTimestamp="2026-03-16 00:11:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:04.915313621 +0000 UTC m=+333.515412051" watchObservedRunningTime="2026-03-16 00:12:23.754663285 +0000 UTC m=+352.354761715" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.755124 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kjc2w" podStartSLOduration=44.06910464 podStartE2EDuration="2m15.755118247s" podCreationTimestamp="2026-03-16 00:10:08 +0000 UTC" firstStartedPulling="2026-03-16 00:10:10.53564224 +0000 UTC m=+219.135740670" lastFinishedPulling="2026-03-16 00:11:42.221655847 +0000 UTC m=+310.821754277" observedRunningTime="2026-03-16 00:12:04.833630832 +0000 UTC m=+333.433729262" watchObservedRunningTime="2026-03-16 00:12:23.755118247 +0000 UTC m=+352.355216677" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.755645 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sv5g7" podStartSLOduration=41.526657801 podStartE2EDuration="2m17.755638401s" podCreationTimestamp="2026-03-16 00:10:06 +0000 UTC" firstStartedPulling="2026-03-16 00:10:08.437563419 +0000 UTC m=+217.037661849" lastFinishedPulling="2026-03-16 00:11:44.666544019 +0000 UTC m=+313.266642449" observedRunningTime="2026-03-16 
00:12:04.783068281 +0000 UTC m=+333.383166751" watchObservedRunningTime="2026-03-16 00:12:23.755638401 +0000 UTC m=+352.355736831" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.756207 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hsgsl" podStartSLOduration=42.350967706 podStartE2EDuration="2m18.756198127s" podCreationTimestamp="2026-03-16 00:10:05 +0000 UTC" firstStartedPulling="2026-03-16 00:10:08.43759261 +0000 UTC m=+217.037691040" lastFinishedPulling="2026-03-16 00:11:44.842823031 +0000 UTC m=+313.442921461" observedRunningTime="2026-03-16 00:12:04.887748819 +0000 UTC m=+333.487847249" watchObservedRunningTime="2026-03-16 00:12:23.756198127 +0000 UTC m=+352.356296557" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.756598 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.756641 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.760715 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.773178 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.775037 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.779726 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=19.77971496 podStartE2EDuration="19.77971496s" podCreationTimestamp="2026-03-16 00:12:04 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:23.779014311 +0000 UTC m=+352.379112751" watchObservedRunningTime="2026-03-16 00:12:23.77971496 +0000 UTC m=+352.379813390" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.816124 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.855573 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.867328 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 16 00:12:23 crc kubenswrapper[4983]: I0316 00:12:23.941374 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.033510 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.127698 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.144468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.222974 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.264998 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-console"/"networking-console-plugin-cert" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.308045 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.378238 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.444272 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.491197 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.612800 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.618010 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.633461 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.684659 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.698344 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"] Mar 16 00:12:24 crc kubenswrapper[4983]: E0316 00:12:24.698629 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerName="installer" Mar 16 00:12:24 crc 
kubenswrapper[4983]: I0316 00:12:24.698653 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerName="installer" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.698802 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0d91b2-07e6-4d69-ba2d-a1abde0ff1ef" containerName="installer" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.699242 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.701713 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.701713 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.702245 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.714614 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"] Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.749043 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.765240 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.779918 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.821105 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.835888 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"auto-csr-approver-29560332-pflh5\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") " pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.883765 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.890257 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.904349 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.937873 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"auto-csr-approver-29560332-pflh5\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") " pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:24 crc kubenswrapper[4983]: I0316 00:12:24.961611 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"auto-csr-approver-29560332-pflh5\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") " pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.061649 4983 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.123963 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.171576 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.202849 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.308773 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.341112 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.424109 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.480600 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.480854 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"] Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.530084 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.554647 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 
00:12:25.565212 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.612368 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-pflh5" event={"ID":"90d9cc10-08aa-485e-a7cd-305a3e316c39","Type":"ContainerStarted","Data":"dc6b9b2afc9fc030b727abc83bb90ecd98dd4e195ab2eb513f9fb76ca8eb3fc0"} Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.627490 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.661448 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.719318 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.764035 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.810743 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.911538 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"] Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.911743 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager" 
containerID="cri-o://a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef" gracePeriod=30 Mar 16 00:12:25 crc kubenswrapper[4983]: I0316 00:12:25.933035 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.008604 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"] Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.009064 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager" containerID="cri-o://3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9" gracePeriod=30 Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.154385 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.209135 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.324247 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.373120 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.398340 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.406193 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.450185 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453482 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453536 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453600 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453644 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") pod \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.453715 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") pod 
\"9b4adba1-ea9b-4255-9ae7-c311268a26f2\" (UID: \"9b4adba1-ea9b-4255-9ae7-c311268a26f2\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.455050 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.455104 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config" (OuterVolumeSpecName: "config") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.456172 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.460336 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp" (OuterVolumeSpecName: "kube-api-access-sk2gp") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "kube-api-access-sk2gp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.461553 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b4adba1-ea9b-4255-9ae7-c311268a26f2" (UID: "9b4adba1-ea9b-4255-9ae7-c311268a26f2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.478919 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.493689 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.531881 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554367 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554404 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554456 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554503 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") pod \"d9cb240d-7329-47d5-89bd-d03b287f52c8\" (UID: \"d9cb240d-7329-47d5-89bd-d03b287f52c8\") " Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554729 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554742 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk2gp\" (UniqueName: \"kubernetes.io/projected/9b4adba1-ea9b-4255-9ae7-c311268a26f2-kube-api-access-sk2gp\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554767 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b4adba1-ea9b-4255-9ae7-c311268a26f2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554776 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.554784 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b4adba1-ea9b-4255-9ae7-c311268a26f2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.555452 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.556097 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config" (OuterVolumeSpecName: "config") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.557605 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z" (OuterVolumeSpecName: "kube-api-access-bm24z") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "kube-api-access-bm24z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.557909 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d9cb240d-7329-47d5-89bd-d03b287f52c8" (UID: "d9cb240d-7329-47d5-89bd-d03b287f52c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619103 4983 generic.go:334] "Generic (PLEG): container finished" podID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9" exitCode=0 Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619182 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619211 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerDied","Data":"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"} Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619263 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv" event={"ID":"d9cb240d-7329-47d5-89bd-d03b287f52c8","Type":"ContainerDied","Data":"41ca8bb8ee852710cef23cf60c78041ce97f661d4205df66bc614cb80ff3bc4b"} Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.619280 4983 scope.go:117] "RemoveContainer" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.621888 4983 generic.go:334] "Generic (PLEG): container finished" podID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef" exitCode=0 Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.621912 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerDied","Data":"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"} Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.621948 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" event={"ID":"9b4adba1-ea9b-4255-9ae7-c311268a26f2","Type":"ContainerDied","Data":"b87435187025d2237e0f17ddb78e0ba9a95e95ca7aa15ba629de1b087d9ffdef"} Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.622019 4983 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b8585898d-qb964" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.636979 4983 scope.go:117] "RemoveContainer" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9" Mar 16 00:12:26 crc kubenswrapper[4983]: E0316 00:12:26.637313 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9\": container with ID starting with 3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9 not found: ID does not exist" containerID="3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.637342 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9"} err="failed to get container status \"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9\": rpc error: code = NotFound desc = could not find container \"3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9\": container with ID starting with 3eb07dcbcb90ff3d72f8e3a9af2b0e3110b8db68dc28ad35be0a6a8bca79c7f9 not found: ID does not exist" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.637360 4983 scope.go:117] "RemoveContainer" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.647166 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.652030 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"] Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656048 4983 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bm24z\" (UniqueName: \"kubernetes.io/projected/d9cb240d-7329-47d5-89bd-d03b287f52c8-kube-api-access-bm24z\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656083 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9cb240d-7329-47d5-89bd-d03b287f52c8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656099 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.656112 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d9cb240d-7329-47d5-89bd-d03b287f52c8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.658623 4983 scope.go:117] "RemoveContainer" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef" Mar 16 00:12:26 crc kubenswrapper[4983]: E0316 00:12:26.659018 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef\": container with ID starting with a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef not found: ID does not exist" containerID="a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.659109 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef"} err="failed to get container status \"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef\": rpc error: code = NotFound desc = 
could not find container \"a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef\": container with ID starting with a45f9946236f02af1d7c87c408b4ae7b0f96500a43e627f091f00527997e2bef not found: ID does not exist" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.664460 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b8585898d-qb964"] Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.670717 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"] Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.675933 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f645c6d4-kwwjv"] Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.713717 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.758464 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.763891 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.766917 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.805248 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.817535 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 16 00:12:26 crc 
kubenswrapper[4983]: I0316 00:12:26.936628 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.943731 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.967071 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 16 00:12:26 crc kubenswrapper[4983]: I0316 00:12:26.978246 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.179808 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.281712 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.329779 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.399408 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.450946 4983 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.451206 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6" gracePeriod=5 Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.486707 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.631949 4983 generic.go:334] "Generic (PLEG): container finished" podID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerID="0e3f6e1e6221d6bd922f567a1feb21e97e8062170d3d8a1f33f38076de2dd3b8" exitCode=0 Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.632133 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-pflh5" event={"ID":"90d9cc10-08aa-485e-a7cd-305a3e316c39","Type":"ContainerDied","Data":"0e3f6e1e6221d6bd922f567a1feb21e97e8062170d3d8a1f33f38076de2dd3b8"} Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.644861 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689430 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"] Mar 16 00:12:27 crc kubenswrapper[4983]: E0316 00:12:27.689735 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689773 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager" Mar 16 00:12:27 crc kubenswrapper[4983]: E0316 00:12:27.689787 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689797 4983 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager" Mar 16 00:12:27 crc kubenswrapper[4983]: E0316 00:12:27.689811 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689819 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689977 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" containerName="route-controller-manager" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.689995 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.690010 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" containerName="controller-manager" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.690475 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.692353 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.693053 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.693560 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.693742 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.694625 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.694952 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.696604 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"] Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.697309 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.703671 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.703844 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704048 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704372 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704727 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.704906 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.706515 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.710098 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"] Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.712846 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"] Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.713983 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:12:27 
crc kubenswrapper[4983]: I0316 00:12:27.769205 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769247 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769267 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769300 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769364 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769395 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769412 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769435 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.769484 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " 
pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.778493 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.799748 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870061 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870264 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870440 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870514 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcpjg\" (UniqueName: 
\"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870598 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870689 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870786 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.870887 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc 
kubenswrapper[4983]: I0316 00:12:27.870982 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872313 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872614 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872825 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.872909 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " 
pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.873157 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.875242 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.877484 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.908277 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"route-controller-manager-574fdb9957-kpx4z\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.910837 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcpjg\" (UniqueName: 
\"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"controller-manager-57bbb7d4d6-dmnkp\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.945383 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.954046 4983 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 16 00:12:27 crc kubenswrapper[4983]: I0316 00:12:27.998284 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.014790 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.024530 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.100521 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4adba1-ea9b-4255-9ae7-c311268a26f2" path="/var/lib/kubelet/pods/9b4adba1-ea9b-4255-9ae7-c311268a26f2/volumes" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.101297 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cb240d-7329-47d5-89bd-d03b287f52c8" path="/var/lib/kubelet/pods/d9cb240d-7329-47d5-89bd-d03b287f52c8/volumes" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.146151 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.170568 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.275793 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.316415 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.345803 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.398824 4983 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.402404 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.410557 4983 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.435000 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.465484 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.490413 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"] Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.494651 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"] Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.639986 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerStarted","Data":"b9888d3c16b45058c14a5b2ded8f3ebdb76d157bd33693c09d4465398bc356f5"} Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.641661 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerStarted","Data":"a1d7dea939295229f9e48941df849399a44450bd71bed65af8d7bf28aa012cb3"} Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.856642 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.982620 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") pod \"90d9cc10-08aa-485e-a7cd-305a3e316c39\" (UID: \"90d9cc10-08aa-485e-a7cd-305a3e316c39\") " Mar 16 00:12:28 crc kubenswrapper[4983]: I0316 00:12:28.986973 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7" (OuterVolumeSpecName: "kube-api-access-k76k7") pod "90d9cc10-08aa-485e-a7cd-305a3e316c39" (UID: "90d9cc10-08aa-485e-a7cd-305a3e316c39"). InnerVolumeSpecName "kube-api-access-k76k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.046038 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.083645 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k76k7\" (UniqueName: \"kubernetes.io/projected/90d9cc10-08aa-485e-a7cd-305a3e316c39-kube-api-access-k76k7\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.196511 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.241833 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.296707 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.447065 4983 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.483393 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.650749 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560332-pflh5" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.650684 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560332-pflh5" event={"ID":"90d9cc10-08aa-485e-a7cd-305a3e316c39","Type":"ContainerDied","Data":"dc6b9b2afc9fc030b727abc83bb90ecd98dd4e195ab2eb513f9fb76ca8eb3fc0"} Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.651380 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6b9b2afc9fc030b727abc83bb90ecd98dd4e195ab2eb513f9fb76ca8eb3fc0" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.652225 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerStarted","Data":"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"} Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.655246 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.657904 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerStarted","Data":"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"} Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 
00:12:29.658904 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.660926 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.662939 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.670826 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" podStartSLOduration=4.670803316 podStartE2EDuration="4.670803316s" podCreationTimestamp="2026-03-16 00:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:29.668978877 +0000 UTC m=+358.269077307" watchObservedRunningTime="2026-03-16 00:12:29.670803316 +0000 UTC m=+358.270901776" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.714082 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" podStartSLOduration=3.714063391 podStartE2EDuration="3.714063391s" podCreationTimestamp="2026-03-16 00:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:29.712348695 +0000 UTC m=+358.312447125" watchObservedRunningTime="2026-03-16 00:12:29.714063391 +0000 UTC m=+358.314161831" Mar 16 00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.796632 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 16 
00:12:29 crc kubenswrapper[4983]: I0316 00:12:29.946822 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.050468 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.151957 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.291165 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.555337 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 16 00:12:30 crc kubenswrapper[4983]: I0316 00:12:30.670144 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.044519 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.077600 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.083013 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.194263 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.345134 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.513921 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.602071 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 16 00:12:31 crc kubenswrapper[4983]: I0316 00:12:31.703546 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.334564 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.421542 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.616153 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.616232 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.657956 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658000 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658014 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658048 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658072 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658120 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658159 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658193 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658234 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658586 4983 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658613 4983 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658632 4983 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.658648 4983 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.664981 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.675876 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.675925 4983 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6" exitCode=137 Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.675969 4983 scope.go:117] "RemoveContainer" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.676003 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.712304 4983 scope.go:117] "RemoveContainer" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6" Mar 16 00:12:32 crc kubenswrapper[4983]: E0316 00:12:32.713119 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6\": container with ID starting with bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6 not found: ID does not exist" containerID="bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.713172 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6"} err="failed to get container status \"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6\": rpc error: code = NotFound desc = could not find container 
\"bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6\": container with ID starting with bedd93399d023d1ac0e6b0dd4836e7d132bba8b280973ff9c762382469618be6 not found: ID does not exist" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.759688 4983 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:32 crc kubenswrapper[4983]: I0316 00:12:32.844191 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.105595 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.106153 4983 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.122078 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.122111 4983 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5353c607-9ed4-4276-9367-bcd7087a8af4" Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.130035 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 16 00:12:34 crc kubenswrapper[4983]: I0316 00:12:34.130107 4983 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5353c607-9ed4-4276-9367-bcd7087a8af4" Mar 16 00:12:45 crc kubenswrapper[4983]: 
I0316 00:12:45.905279 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"] Mar 16 00:12:45 crc kubenswrapper[4983]: I0316 00:12:45.906027 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager" containerID="cri-o://d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877" gracePeriod=30 Mar 16 00:12:45 crc kubenswrapper[4983]: I0316 00:12:45.920787 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"] Mar 16 00:12:45 crc kubenswrapper[4983]: I0316 00:12:45.920987 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager" containerID="cri-o://49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7" gracePeriod=30 Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.463286 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.513477 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543368 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543453 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543478 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543527 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543558 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543577 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543591 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") pod \"ba019949-b4c5-4df0-b625-32daf56cabec\" (UID: \"ba019949-b4c5-4df0-b625-32daf56cabec\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543620 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.543646 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") pod \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\" (UID: \"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa\") " Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.544951 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca" (OuterVolumeSpecName: "client-ca") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.544952 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca" (OuterVolumeSpecName: "client-ca") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.545048 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.545576 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config" (OuterVolumeSpecName: "config") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.545624 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config" (OuterVolumeSpecName: "config") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549065 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549111 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549204 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg" (OuterVolumeSpecName: "kube-api-access-wcpjg") pod "ba019949-b4c5-4df0-b625-32daf56cabec" (UID: "ba019949-b4c5-4df0-b625-32daf56cabec"). InnerVolumeSpecName "kube-api-access-wcpjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.549442 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t" (OuterVolumeSpecName: "kube-api-access-w248t") pod "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" (UID: "5915c9d6-caa5-4522-a2d0-9ebf1068a4fa"). InnerVolumeSpecName "kube-api-access-w248t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645258 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w248t\" (UniqueName: \"kubernetes.io/projected/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-kube-api-access-w248t\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645293 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba019949-b4c5-4df0-b625-32daf56cabec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645309 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcpjg\" (UniqueName: \"kubernetes.io/projected/ba019949-b4c5-4df0-b625-32daf56cabec-kube-api-access-wcpjg\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645320 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645332 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645342 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba019949-b4c5-4df0-b625-32daf56cabec-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645354 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645364 4983 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.645376 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759073 4983 generic.go:334] "Generic (PLEG): container finished" podID="ba019949-b4c5-4df0-b625-32daf56cabec" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877" exitCode=0 Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759122 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerDied","Data":"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"} Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759136 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759161 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp" event={"ID":"ba019949-b4c5-4df0-b625-32daf56cabec","Type":"ContainerDied","Data":"a1d7dea939295229f9e48941df849399a44450bd71bed65af8d7bf28aa012cb3"} Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.759181 4983 scope.go:117] "RemoveContainer" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765434 4983 generic.go:334] "Generic (PLEG): container finished" podID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7" exitCode=0 Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765725 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerDied","Data":"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"} Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765900 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.765952 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z" event={"ID":"5915c9d6-caa5-4522-a2d0-9ebf1068a4fa","Type":"ContainerDied","Data":"b9888d3c16b45058c14a5b2ded8f3ebdb76d157bd33693c09d4465398bc356f5"} Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.781508 4983 scope.go:117] "RemoveContainer" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877" Mar 16 00:12:46 crc kubenswrapper[4983]: E0316 00:12:46.785682 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877\": container with ID starting with d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877 not found: ID does not exist" containerID="d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.785889 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877"} err="failed to get container status \"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877\": rpc error: code = NotFound desc = could not find container \"d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877\": container with ID starting with d955842b54dcca61deb3011f3f6b7c6a401b07f29a9772d80c04663a8269c877 not found: ID does not exist" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.786164 4983 scope.go:117] "RemoveContainer" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.789157 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"] Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.795212 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-dmnkp"] Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.809279 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"] Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.811385 4983 scope.go:117] "RemoveContainer" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7" Mar 16 00:12:46 crc kubenswrapper[4983]: E0316 00:12:46.811849 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7\": container with ID starting with 49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7 not found: ID does not exist" containerID="49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.811897 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7"} err="failed to get container status \"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7\": rpc error: code = NotFound desc = could not find container \"49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7\": container with ID starting with 49b337efff0f257a0d7392d0fc521e7f5df4f6d15758945a7fbb4f1de2e7b1f7 not found: ID does not exist" Mar 16 00:12:46 crc kubenswrapper[4983]: I0316 00:12:46.813647 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-kpx4z"] Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703246 4983 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 00:12:47 crc kubenswrapper[4983]: E0316 00:12:47.703548 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703564 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager" Mar 16 00:12:47 crc kubenswrapper[4983]: E0316 00:12:47.703576 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerName="oc" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703586 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerName="oc" Mar 16 00:12:47 crc kubenswrapper[4983]: E0316 00:12:47.703600 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703609 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703732 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" containerName="controller-manager" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703748 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" containerName="route-controller-manager" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.703782 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" containerName="oc" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.704241 4983 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706116 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706483 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706906 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.706901 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.707472 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.708123 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.708608 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.708692 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710181 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710407 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710789 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.710787 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.711519 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.715040 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.715277 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.716314 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 
00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.726517 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758189 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758243 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758270 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758295 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc 
kubenswrapper[4983]: I0316 00:12:47.758345 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758376 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758411 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758435 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.758515 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jlp\" (UniqueName: 
\"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.859595 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.859951 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.860066 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.860184 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 
00:12:47.861410 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861518 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861671 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861774 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.862180 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"controller-manager-78c886458b-c7whn\" (UID: 
\"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.861364 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.860434 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.862811 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.863109 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.864465 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod 
\"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.864916 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.864960 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.877594 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"controller-manager-78c886458b-c7whn\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:47 crc kubenswrapper[4983]: I0316 00:12:47.883075 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"route-controller-manager-8697489c76-cnkxm\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.025308 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.036071 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.147643 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5915c9d6-caa5-4522-a2d0-9ebf1068a4fa" path="/var/lib/kubelet/pods/5915c9d6-caa5-4522-a2d0-9ebf1068a4fa/volumes" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.149479 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba019949-b4c5-4df0-b625-32daf56cabec" path="/var/lib/kubelet/pods/ba019949-b4c5-4df0-b625-32daf56cabec/volumes" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.474862 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.480134 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:12:48 crc kubenswrapper[4983]: W0316 00:12:48.482246 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7705ce2_6b0a_4204_857b_b80448d4b201.slice/crio-3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563 WatchSource:0}: Error finding container 3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563: Status 404 returned error can't find the container with id 3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563 Mar 16 00:12:48 crc kubenswrapper[4983]: W0316 00:12:48.484124 4983 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17c999f7_aab6_48d2_afe8_2c317c1825f5.slice/crio-876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a WatchSource:0}: Error finding container 876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a: Status 404 returned error can't find the container with id 876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.779728 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerStarted","Data":"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885"} Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.779782 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerStarted","Data":"876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a"} Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.779893 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.781232 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerStarted","Data":"81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae"} Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.781261 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" 
event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerStarted","Data":"3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563"} Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.781452 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.792851 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.802485 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" podStartSLOduration=3.802465341 podStartE2EDuration="3.802465341s" podCreationTimestamp="2026-03-16 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:48.799980804 +0000 UTC m=+377.400079234" watchObservedRunningTime="2026-03-16 00:12:48.802465341 +0000 UTC m=+377.402563771" Mar 16 00:12:48 crc kubenswrapper[4983]: I0316 00:12:48.845908 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" podStartSLOduration=3.8458924100000003 podStartE2EDuration="3.84589241s" podCreationTimestamp="2026-03-16 00:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:12:48.841872672 +0000 UTC m=+377.441971102" watchObservedRunningTime="2026-03-16 00:12:48.84589241 +0000 UTC m=+377.445990840" Mar 16 00:12:49 crc kubenswrapper[4983]: I0316 00:12:49.212034 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 
00:12:51 crc kubenswrapper[4983]: I0316 00:12:51.540486 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.376122 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.377793 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerName="controller-manager" containerID="cri-o://81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae" gracePeriod=30 Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.874733 4983 generic.go:334] "Generic (PLEG): container finished" podID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerID="81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae" exitCode=0 Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.875047 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerDied","Data":"81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae"} Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.875077 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" event={"ID":"c7705ce2-6b0a-4204-857b-b80448d4b201","Type":"ContainerDied","Data":"3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563"} Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.875091 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a2e8a3f8b5ab11423514bbab0ca39ce40a537d0189d18befabc970c0081b563" Mar 16 00:13:06 crc kubenswrapper[4983]: I0316 00:13:06.912776 4983 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.095937 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096047 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096231 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096391 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.096459 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") pod \"c7705ce2-6b0a-4204-857b-b80448d4b201\" (UID: \"c7705ce2-6b0a-4204-857b-b80448d4b201\") " Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.097219 4983 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.097324 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config" (OuterVolumeSpecName: "config") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.097578 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.103142 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp" (OuterVolumeSpecName: "kube-api-access-m2jlp") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). InnerVolumeSpecName "kube-api-access-m2jlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.103911 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7705ce2-6b0a-4204-857b-b80448d4b201" (UID: "c7705ce2-6b0a-4204-857b-b80448d4b201"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197522 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7705ce2-6b0a-4204-857b-b80448d4b201-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197573 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197587 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2jlp\" (UniqueName: \"kubernetes.io/projected/c7705ce2-6b0a-4204-857b-b80448d4b201-kube-api-access-m2jlp\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197625 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.197636 4983 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7705ce2-6b0a-4204-857b-b80448d4b201-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.716245 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8"] Mar 16 00:13:07 crc kubenswrapper[4983]: E0316 00:13:07.716689 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerName="controller-manager" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.716708 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" 
containerName="controller-manager" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.716887 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" containerName="controller-manager" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.717514 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.726417 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8"] Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.879944 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78c886458b-c7whn" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907093 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-client-ca\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907141 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27455c79-fbad-4784-9ca8-8280c3561064-serving-cert\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907171 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-config\") pod 
\"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907215 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907231 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjld5\" (UniqueName: \"kubernetes.io/projected/27455c79-fbad-4784-9ca8-8280c3561064-kube-api-access-rjld5\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.907396 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 00:13:07 crc kubenswrapper[4983]: I0316 00:13:07.912830 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78c886458b-c7whn"] Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008175 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-client-ca\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008419 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/27455c79-fbad-4784-9ca8-8280c3561064-serving-cert\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008450 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-config\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008502 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjld5\" (UniqueName: \"kubernetes.io/projected/27455c79-fbad-4784-9ca8-8280c3561064-kube-api-access-rjld5\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.008524 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.009485 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-client-ca\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 
00:13:08.009780 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-proxy-ca-bundles\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.010486 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27455c79-fbad-4784-9ca8-8280c3561064-config\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.013333 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27455c79-fbad-4784-9ca8-8280c3561064-serving-cert\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.037376 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjld5\" (UniqueName: \"kubernetes.io/projected/27455c79-fbad-4784-9ca8-8280c3561064-kube-api-access-rjld5\") pod \"controller-manager-57bbb7d4d6-d2zd8\" (UID: \"27455c79-fbad-4784-9ca8-8280c3561064\") " pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.097949 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7705ce2-6b0a-4204-857b-b80448d4b201" path="/var/lib/kubelet/pods/c7705ce2-6b0a-4204-857b-b80448d4b201/volumes" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.333122 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.566156 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8"] Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.886716 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" event={"ID":"27455c79-fbad-4784-9ca8-8280c3561064","Type":"ContainerStarted","Data":"32f011957e1f8eccbea09f7766b3a6d671f89934a7e8fbccd2167a586bec10fb"} Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.887116 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" event={"ID":"27455c79-fbad-4784-9ca8-8280c3561064","Type":"ContainerStarted","Data":"070a7fb6ed9bd86aeb28bd34c2403bfa51a62ba1157341aad1702a3079f4691c"} Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.887572 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.893057 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" Mar 16 00:13:08 crc kubenswrapper[4983]: I0316 00:13:08.903141 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57bbb7d4d6-d2zd8" podStartSLOduration=2.903124467 podStartE2EDuration="2.903124467s" podCreationTimestamp="2026-03-16 00:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:08.902471569 +0000 UTC m=+397.502570009" watchObservedRunningTime="2026-03-16 00:13:08.903124467 +0000 UTC m=+397.503222897" Mar 16 00:13:09 crc 
kubenswrapper[4983]: I0316 00:13:09.744077 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"] Mar 16 00:13:09 crc kubenswrapper[4983]: I0316 00:13:09.744303 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sv5g7" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" containerID="cri-o://4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72" gracePeriod=2 Mar 16 00:13:09 crc kubenswrapper[4983]: I0316 00:13:09.912155 4983 generic.go:334] "Generic (PLEG): container finished" podID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerID="4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72" exitCode=0 Mar 16 00:13:09 crc kubenswrapper[4983]: I0316 00:13:09.912215 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.196359 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.334308 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") pod \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.334474 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") pod \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.334528 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") pod \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\" (UID: \"b6bd9bf5-fa59-4fef-9589-7b5865098bd2\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.335259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities" (OuterVolumeSpecName: "utilities") pod "b6bd9bf5-fa59-4fef-9589-7b5865098bd2" (UID: "b6bd9bf5-fa59-4fef-9589-7b5865098bd2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.335956 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txzqn"] Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.336171 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-txzqn" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" containerID="cri-o://670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" gracePeriod=2 Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.345302 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf" (OuterVolumeSpecName: "kube-api-access-x4gpf") pod "b6bd9bf5-fa59-4fef-9589-7b5865098bd2" (UID: "b6bd9bf5-fa59-4fef-9589-7b5865098bd2"). InnerVolumeSpecName "kube-api-access-x4gpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.385220 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6bd9bf5-fa59-4fef-9589-7b5865098bd2" (UID: "b6bd9bf5-fa59-4fef-9589-7b5865098bd2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.436075 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.436110 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.436123 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gpf\" (UniqueName: \"kubernetes.io/projected/b6bd9bf5-fa59-4fef-9589-7b5865098bd2-kube-api-access-x4gpf\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.749977 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919614 4983 generic.go:334] "Generic (PLEG): container finished" podID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" exitCode=0 Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919682 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-txzqn" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919714 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919776 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txzqn" event={"ID":"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4","Type":"ContainerDied","Data":"6ff5c36eac345013e6cc957efaa73b943a59621eebe35f0b43b2431024b1cecb"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.919798 4983 scope.go:117] "RemoveContainer" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.922267 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sv5g7" event={"ID":"b6bd9bf5-fa59-4fef-9589-7b5865098bd2","Type":"ContainerDied","Data":"aae8cc96a35a149bdefbed630e67440f7417544a5d2fbe7864d479595393b42e"} Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.922298 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sv5g7" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") pod \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941409 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") pod \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941536 4983 scope.go:117] "RemoveContainer" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf" Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.941585 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") pod \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\" (UID: \"ca55ad69-3f41-4d0c-8f86-83a583ff6fe4\") " Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.942559 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities" (OuterVolumeSpecName: "utilities") pod "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" (UID: "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.944438 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2" (OuterVolumeSpecName: "kube-api-access-bqln2") pod "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" (UID: "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4"). InnerVolumeSpecName "kube-api-access-bqln2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.962079 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"]
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.964140 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sv5g7"]
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.979382 4983 scope.go:117] "RemoveContainer" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.997650 4983 scope.go:117] "RemoveContainer" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"
Mar 16 00:13:10 crc kubenswrapper[4983]: E0316 00:13:10.998478 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee\": container with ID starting with 670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee not found: ID does not exist" containerID="670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.998549 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee"} err="failed to get container status \"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee\": rpc error: code = NotFound desc = could not find container \"670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee\": container with ID starting with 670f8c8cd0368398530b1dd4f51f0f95c3ef257adee10cc0a48679e93d473bee not found: ID does not exist"
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.998581 4983 scope.go:117] "RemoveContainer" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"
Mar 16 00:13:10 crc kubenswrapper[4983]: E0316 00:13:10.999308 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf\": container with ID starting with 0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf not found: ID does not exist" containerID="0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999341 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf"} err="failed to get container status \"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf\": rpc error: code = NotFound desc = could not find container \"0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf\": container with ID starting with 0d9ad17d4eb970a20a6c0ffd10f26227f77d7e3682c21950d6cdbd001aa44bbf not found: ID does not exist"
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999366 4983 scope.go:117] "RemoveContainer" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"
Mar 16 00:13:10 crc kubenswrapper[4983]: E0316 00:13:10.999811 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c\": container with ID starting with 10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c not found: ID does not exist" containerID="10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999860 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c"} err="failed to get container status \"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c\": rpc error: code = NotFound desc = could not find container \"10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c\": container with ID starting with 10eedc480d5e9e3e1e33b24fa8b1922bce3946513f9ea8ee2866978a27061a4c not found: ID does not exist"
Mar 16 00:13:10 crc kubenswrapper[4983]: I0316 00:13:10.999892 4983 scope.go:117] "RemoveContainer" containerID="4e33c51822af1207a714633dbe36f0cfbe87f71ecdd87a6317177017c49cda72"
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.001373 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" (UID: "ca55ad69-3f41-4d0c-8f86-83a583ff6fe4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.012555 4983 scope.go:117] "RemoveContainer" containerID="ff00e7152e69c4aeaaff4ebd02f8e9bc3011a8b0e33817b723307cd7fa5fe455"
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.031022 4983 scope.go:117] "RemoveContainer" containerID="7455d52b296ac2dc05d5dba007a96face87721af18e58d348eedd55fbc4a2082"
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.043406 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-utilities\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.043433 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.043443 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqln2\" (UniqueName: \"kubernetes.io/projected/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4-kube-api-access-bqln2\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.255694 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txzqn"]
Mar 16 00:13:11 crc kubenswrapper[4983]: I0316 00:13:11.259045 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-txzqn"]
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.107389 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" path="/var/lib/kubelet/pods/b6bd9bf5-fa59-4fef-9589-7b5865098bd2/volumes"
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.108780 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" path="/var/lib/kubelet/pods/ca55ad69-3f41-4d0c-8f86-83a583ff6fe4/volumes"
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.135881 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"]
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.136410 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kjc2w" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" containerID="cri-o://840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb" gracePeriod=2
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.576909 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.734251 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.734467 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qx9g" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" containerID="cri-o://41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00" gracePeriod=2
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.762872 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") pod \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") "
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.763326 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") pod \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") "
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.763481 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") pod \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\" (UID: \"00a4a2a2-9263-4b76-8294-fa9c4d918fc7\") "
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.763629 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities" (OuterVolumeSpecName: "utilities") pod "00a4a2a2-9263-4b76-8294-fa9c4d918fc7" (UID: "00a4a2a2-9263-4b76-8294-fa9c4d918fc7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.764087 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-utilities\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.776993 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w" (OuterVolumeSpecName: "kube-api-access-r8x8w") pod "00a4a2a2-9263-4b76-8294-fa9c4d918fc7" (UID: "00a4a2a2-9263-4b76-8294-fa9c4d918fc7"). InnerVolumeSpecName "kube-api-access-r8x8w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.788644 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00a4a2a2-9263-4b76-8294-fa9c4d918fc7" (UID: "00a4a2a2-9263-4b76-8294-fa9c4d918fc7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.866440 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8x8w\" (UniqueName: \"kubernetes.io/projected/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-kube-api-access-r8x8w\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.866490 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00a4a2a2-9263-4b76-8294-fa9c4d918fc7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.944918 4983 generic.go:334] "Generic (PLEG): container finished" podID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb" exitCode=0
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.944999 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"}
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.945031 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjc2w" event={"ID":"00a4a2a2-9263-4b76-8294-fa9c4d918fc7","Type":"ContainerDied","Data":"edfb4c106db9ff156e89258c7be736e143b651348ae2eece9c28a73c16f1a791"}
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.945051 4983 scope.go:117] "RemoveContainer" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.945162 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjc2w"
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.949345 4983 generic.go:334] "Generic (PLEG): container finished" podID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerID="41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00" exitCode=0
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.949391 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00"}
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.972981 4983 scope.go:117] "RemoveContainer" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.992360 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"]
Mar 16 00:13:12 crc kubenswrapper[4983]: I0316 00:13:12.997961 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjc2w"]
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.006271 4983 scope.go:117] "RemoveContainer" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.032668 4983 scope.go:117] "RemoveContainer" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"
Mar 16 00:13:13 crc kubenswrapper[4983]: E0316 00:13:13.033030 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb\": container with ID starting with 840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb not found: ID does not exist" containerID="840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033056 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb"} err="failed to get container status \"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb\": rpc error: code = NotFound desc = could not find container \"840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb\": container with ID starting with 840f0f757c39c751affd54d3d63de31b49c374902f33d2b69cc7b7ae10bfd5bb not found: ID does not exist"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033075 4983 scope.go:117] "RemoveContainer" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"
Mar 16 00:13:13 crc kubenswrapper[4983]: E0316 00:13:13.033884 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404\": container with ID starting with 86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404 not found: ID does not exist" containerID="86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033927 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404"} err="failed to get container status \"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404\": rpc error: code = NotFound desc = could not find container \"86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404\": container with ID starting with 86019c3ea522ecdaedfa5dbc5deeea3a626e3740041033fbdd7a120c6a0f7404 not found: ID does not exist"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.033957 4983 scope.go:117] "RemoveContainer" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32"
Mar 16 00:13:13 crc kubenswrapper[4983]: E0316 00:13:13.034265 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32\": container with ID starting with 27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32 not found: ID does not exist" containerID="27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.034286 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32"} err="failed to get container status \"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32\": rpc error: code = NotFound desc = could not find container \"27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32\": container with ID starting with 27f5e02ac92789d755943aa157fd3d940e22c0ccfc4f6071525ae9bf37261a32 not found: ID does not exist"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.140937 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.271737 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") pod \"7bc03354-3cba-40ac-a894-844d6ae1ee69\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") "
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.271834 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") pod \"7bc03354-3cba-40ac-a894-844d6ae1ee69\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") "
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.271941 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") pod \"7bc03354-3cba-40ac-a894-844d6ae1ee69\" (UID: \"7bc03354-3cba-40ac-a894-844d6ae1ee69\") "
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.272594 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities" (OuterVolumeSpecName: "utilities") pod "7bc03354-3cba-40ac-a894-844d6ae1ee69" (UID: "7bc03354-3cba-40ac-a894-844d6ae1ee69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.275063 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2" (OuterVolumeSpecName: "kube-api-access-x8nr2") pod "7bc03354-3cba-40ac-a894-844d6ae1ee69" (UID: "7bc03354-3cba-40ac-a894-844d6ae1ee69"). InnerVolumeSpecName "kube-api-access-x8nr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.374392 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8nr2\" (UniqueName: \"kubernetes.io/projected/7bc03354-3cba-40ac-a894-844d6ae1ee69-kube-api-access-x8nr2\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.374427 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-utilities\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.425721 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bc03354-3cba-40ac-a894-844d6ae1ee69" (UID: "7bc03354-3cba-40ac-a894-844d6ae1ee69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.474991 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bc03354-3cba-40ac-a894-844d6ae1ee69-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.957444 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qx9g" event={"ID":"7bc03354-3cba-40ac-a894-844d6ae1ee69","Type":"ContainerDied","Data":"66c02382f4884cf7432e8b1dd2d9aae721248d87c7cd3a1bce60e42991bb56c4"}
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.957836 4983 scope.go:117] "RemoveContainer" containerID="41a3bc85f34bbfc428451e248ff3adf551bec5d147a73350a6adc1cc78464c00"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.957490 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qx9g"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.980232 4983 scope.go:117] "RemoveContainer" containerID="5315d03c3a0c66cd9452cd1be2631735c8666c6ac21135b6c44ab5b65cd08883"
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.987286 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:13:13 crc kubenswrapper[4983]: I0316 00:13:13.997128 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qx9g"]
Mar 16 00:13:14 crc kubenswrapper[4983]: I0316 00:13:14.013954 4983 scope.go:117] "RemoveContainer" containerID="b9712062cb37f4ba2339e9dc2def8ff36e2a54d5fce9ebcc83e68db1e8c9e216"
Mar 16 00:13:14 crc kubenswrapper[4983]: I0316 00:13:14.101814 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" path="/var/lib/kubelet/pods/00a4a2a2-9263-4b76-8294-fa9c4d918fc7/volumes"
Mar 16 00:13:14 crc kubenswrapper[4983]: I0316 00:13:14.102574 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" path="/var/lib/kubelet/pods/7bc03354-3cba-40ac-a894-844d6ae1ee69/volumes"
Mar 16 00:13:16 crc kubenswrapper[4983]: I0316 00:13:16.567456 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" containerID="cri-o://5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4" gracePeriod=15
Mar 16 00:13:16 crc kubenswrapper[4983]: I0316 00:13:16.975954 4983 generic.go:334] "Generic (PLEG): container finished" podID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerID="5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4" exitCode=0
Mar 16 00:13:16 crc kubenswrapper[4983]: I0316 00:13:16.976069 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerDied","Data":"5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4"}
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.060402 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg"
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217025 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217092 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217202 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217268 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217341 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217413 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217462 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217515 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217550 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217588 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217633 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.217662 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") pod \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\" (UID: \"0fd829d1-ad38-407e-a576-43aa5a6ca8f2\") "
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.218623 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.219336 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.221107 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.221238 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.221258 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.224452 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.224896 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.225145 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.225564 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.226528 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.232225 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.233739 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx" (OuterVolumeSpecName: "kube-api-access-5xzhx") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "kube-api-access-5xzhx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.232923 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.233553 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0fd829d1-ad38-407e-a576-43aa5a6ca8f2" (UID: "0fd829d1-ad38-407e-a576-43aa5a6ca8f2"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319305 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319341 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319352 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319364 4983 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319378 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319391 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319406 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319418 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319430 4983 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319441 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319454 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319465 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319477 4983 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.319489 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xzhx\" (UniqueName: \"kubernetes.io/projected/0fd829d1-ad38-407e-a576-43aa5a6ca8f2-kube-api-access-5xzhx\") on node \"crc\" DevicePath \"\""
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.985698 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" event={"ID":"0fd829d1-ad38-407e-a576-43aa5a6ca8f2","Type":"ContainerDied","Data":"992aee5b0776d510c59718dbe65f51126e10a5ddde1021826a4cd33845179277"}
Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.985800 4983 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-df6gg" Mar 16 00:13:17 crc kubenswrapper[4983]: I0316 00:13:17.985855 4983 scope.go:117] "RemoveContainer" containerID="5ab76987f0d86f28eac9406e16f1acebdbf300a37f32a1aa45d218eb2af1f3e4" Mar 16 00:13:18 crc kubenswrapper[4983]: I0316 00:13:18.036524 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:13:18 crc kubenswrapper[4983]: I0316 00:13:18.039028 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-df6gg"] Mar 16 00:13:18 crc kubenswrapper[4983]: I0316 00:13:18.098672 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" path="/var/lib/kubelet/pods/0fd829d1-ad38-407e-a576-43aa5a6ca8f2/volumes" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.935609 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vr64c"] Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936053 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936064 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936072 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936078 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936085 4983 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936090 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936104 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936110 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936118 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936124 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936133 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936139 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936147 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936153 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936160 4983 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936165 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936173 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936178 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936188 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936193 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936202 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936208 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="extract-utilities" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936218 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936223 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: E0316 00:13:20.936230 4983 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936235 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="extract-content" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936319 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a4a2a2-9263-4b76-8294-fa9c4d918fc7" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936330 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca55ad69-3f41-4d0c-8f86-83a583ff6fe4" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936341 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6bd9bf5-fa59-4fef-9589-7b5865098bd2" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936349 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd829d1-ad38-407e-a576-43aa5a6ca8f2" containerName="oauth-openshift" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936375 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bc03354-3cba-40ac-a894-844d6ae1ee69" containerName="registry-server" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.936760 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:20 crc kubenswrapper[4983]: I0316 00:13:20.952496 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vr64c"] Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062154 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pj9h\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-kube-api-access-4pj9h\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-tls\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062235 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75689cb1-d790-48a1-91b5-6880d37ecb86-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062299 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-trusted-ca\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 
16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062351 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75689cb1-d790-48a1-91b5-6880d37ecb86-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062392 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-bound-sa-token\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062443 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-certificates\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.062485 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.088902 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.163929 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-certificates\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164002 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pj9h\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-kube-api-access-4pj9h\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164026 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-tls\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164041 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75689cb1-d790-48a1-91b5-6880d37ecb86-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164056 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-trusted-ca\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164206 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75689cb1-d790-48a1-91b5-6880d37ecb86-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.164234 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-bound-sa-token\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.165623 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-certificates\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.165922 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/75689cb1-d790-48a1-91b5-6880d37ecb86-ca-trust-extracted\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" 
Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.167309 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/75689cb1-d790-48a1-91b5-6880d37ecb86-trusted-ca\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.170009 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/75689cb1-d790-48a1-91b5-6880d37ecb86-installation-pull-secrets\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.171053 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-registry-tls\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.184478 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-bound-sa-token\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.184724 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pj9h\" (UniqueName: \"kubernetes.io/projected/75689cb1-d790-48a1-91b5-6880d37ecb86-kube-api-access-4pj9h\") pod \"image-registry-66df7c8f76-vr64c\" (UID: \"75689cb1-d790-48a1-91b5-6880d37ecb86\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.317122 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:21 crc kubenswrapper[4983]: I0316 00:13:21.695652 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-vr64c"] Mar 16 00:13:21 crc kubenswrapper[4983]: W0316 00:13:21.706115 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75689cb1_d790_48a1_91b5_6880d37ecb86.slice/crio-45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441 WatchSource:0}: Error finding container 45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441: Status 404 returned error can't find the container with id 45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441 Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.027642 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" event={"ID":"75689cb1-d790-48a1-91b5-6880d37ecb86","Type":"ContainerStarted","Data":"e46ca17635d18ccc135e4e82d37d32003610abc69d68fd1fcef8227c4b1cd844"} Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.027716 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" event={"ID":"75689cb1-d790-48a1-91b5-6880d37ecb86","Type":"ContainerStarted","Data":"45c6882943a40a289a3173f68c712aed8808c792d721366601a6f8d66e4c0441"} Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.027971 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.058585 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" podStartSLOduration=2.058559523 podStartE2EDuration="2.058559523s" podCreationTimestamp="2026-03-16 00:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:22.052258118 +0000 UTC m=+410.652356588" watchObservedRunningTime="2026-03-16 00:13:22.058559523 +0000 UTC m=+410.658657983" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.730216 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5d79794f9d-7s5jx"] Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.731266 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.736190 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.736687 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.736736 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.738085 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.738423 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.739061 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 16 00:13:22 crc 
kubenswrapper[4983]: I0316 00:13:22.739247 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.739743 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.739959 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.740461 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.740787 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.750835 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.751310 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.757595 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.764318 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.765618 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d79794f9d-7s5jx"] Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783680 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783768 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783842 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.783959 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784006 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784050 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784083 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-session\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784110 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784142 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5whj4\" (UniqueName: \"kubernetes.io/projected/061c8b83-1d07-4b74-9689-a86e3363a770-kube-api-access-5whj4\") pod 
\"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784224 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-audit-policies\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784274 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/061c8b83-1d07-4b74-9689-a86e3363a770-audit-dir\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784299 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784337 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.784368 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885465 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885537 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/061c8b83-1d07-4b74-9689-a86e3363a770-audit-dir\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885592 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885624 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-error\") pod 
\"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885682 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885712 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885789 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885811 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc 
kubenswrapper[4983]: I0316 00:13:22.885832 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885882 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885906 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-session\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885952 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.885977 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5whj4\" (UniqueName: 
\"kubernetes.io/projected/061c8b83-1d07-4b74-9689-a86e3363a770-kube-api-access-5whj4\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.886037 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-audit-policies\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.887078 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/061c8b83-1d07-4b74-9689-a86e3363a770-audit-dir\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.887550 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.887868 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-service-ca\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.888864 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-audit-policies\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.889065 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.893123 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-router-certs\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.893404 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.894023 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: 
\"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.894586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-error\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.895380 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-login\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.897478 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-session\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.904163 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.904228 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/061c8b83-1d07-4b74-9689-a86e3363a770-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:22 crc kubenswrapper[4983]: I0316 00:13:22.905849 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5whj4\" (UniqueName: \"kubernetes.io/projected/061c8b83-1d07-4b74-9689-a86e3363a770-kube-api-access-5whj4\") pod \"oauth-openshift-5d79794f9d-7s5jx\" (UID: \"061c8b83-1d07-4b74-9689-a86e3363a770\") " pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.055269 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.448541 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.448650 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:13:23 crc kubenswrapper[4983]: I0316 00:13:23.466981 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5d79794f9d-7s5jx"] Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.043823 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" event={"ID":"061c8b83-1d07-4b74-9689-a86e3363a770","Type":"ContainerStarted","Data":"31fe1c68d7249884a5bb82e1cd9a84ebe1594199f7c8385d5d8319c48f1031a1"} Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.044131 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" event={"ID":"061c8b83-1d07-4b74-9689-a86e3363a770","Type":"ContainerStarted","Data":"978b3636703efe358b96e42970e9c1b57c6101833ceb06764199ae65aaedb149"} Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.044542 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.082463 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" podStartSLOduration=33.082432313 podStartE2EDuration="33.082432313s" podCreationTimestamp="2026-03-16 00:12:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:24.07234536 +0000 UTC m=+412.672443800" watchObservedRunningTime="2026-03-16 00:13:24.082432313 +0000 UTC m=+412.682530783" Mar 16 00:13:24 crc kubenswrapper[4983]: I0316 00:13:24.306540 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5d79794f9d-7s5jx" Mar 16 00:13:25 crc kubenswrapper[4983]: I0316 00:13:25.983202 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:13:25 crc kubenswrapper[4983]: I0316 00:13:25.984503 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" 
podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" containerID="cri-o://fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" gracePeriod=30 Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.500625 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641817 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641907 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641937 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.641962 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") pod \"17c999f7-aab6-48d2-afe8-2c317c1825f5\" (UID: \"17c999f7-aab6-48d2-afe8-2c317c1825f5\") " Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.642814 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config" (OuterVolumeSpecName: "config") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.643648 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca" (OuterVolumeSpecName: "client-ca") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.646271 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.646646 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r" (OuterVolumeSpecName: "kube-api-access-2pg6r") pod "17c999f7-aab6-48d2-afe8-2c317c1825f5" (UID: "17c999f7-aab6-48d2-afe8-2c317c1825f5"). InnerVolumeSpecName "kube-api-access-2pg6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743483 4983 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743518 4983 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17c999f7-aab6-48d2-afe8-2c317c1825f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743527 4983 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17c999f7-aab6-48d2-afe8-2c317c1825f5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:26 crc kubenswrapper[4983]: I0316 00:13:26.743536 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pg6r\" (UniqueName: \"kubernetes.io/projected/17c999f7-aab6-48d2-afe8-2c317c1825f5-kube-api-access-2pg6r\") on node \"crc\" DevicePath \"\"" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066071 4983 generic.go:334] "Generic (PLEG): container finished" podID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" exitCode=0 Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerDied","Data":"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885"} Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066183 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066203 4983 scope.go:117] "RemoveContainer" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.066188 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm" event={"ID":"17c999f7-aab6-48d2-afe8-2c317c1825f5","Type":"ContainerDied","Data":"876216aea4183e045dea0a00510451ec87862184e09e5c477a2acd8429042b4a"} Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.092243 4983 scope.go:117] "RemoveContainer" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" Mar 16 00:13:27 crc kubenswrapper[4983]: E0316 00:13:27.093285 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885\": container with ID starting with fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885 not found: ID does not exist" containerID="fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.093351 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885"} err="failed to get container status \"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885\": rpc error: code = NotFound desc = could not find container \"fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885\": container with ID starting with fc5d69b0b32ac3e50bb5dfa90d98a254ca322c56f9d01d7c2ce4a11fc8018885 not found: ID does not exist" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.106124 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.110235 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8697489c76-cnkxm"] Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.734575 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq"] Mar 16 00:13:27 crc kubenswrapper[4983]: E0316 00:13:27.734914 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.734939 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.735150 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" containerName="route-controller-manager" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.735911 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739471 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739491 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739522 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.739479 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.740131 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.740161 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.753581 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq"] Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858047 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-config\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858135 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-client-ca\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858192 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvtk8\" (UniqueName: \"kubernetes.io/projected/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-kube-api-access-nvtk8\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.858227 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-serving-cert\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-config\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960180 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-client-ca\") pod 
\"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960243 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvtk8\" (UniqueName: \"kubernetes.io/projected/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-kube-api-access-nvtk8\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.960294 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-serving-cert\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.962041 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-client-ca\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.965979 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-serving-cert\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.966104 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-config\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:27 crc kubenswrapper[4983]: I0316 00:13:27.983072 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvtk8\" (UniqueName: \"kubernetes.io/projected/9278fbf1-bc49-4361-b1c8-4b63798e5fc7-kube-api-access-nvtk8\") pod \"route-controller-manager-574fdb9957-brmlq\" (UID: \"9278fbf1-bc49-4361-b1c8-4b63798e5fc7\") " pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:28 crc kubenswrapper[4983]: I0316 00:13:28.062108 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:28 crc kubenswrapper[4983]: I0316 00:13:28.104518 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17c999f7-aab6-48d2-afe8-2c317c1825f5" path="/var/lib/kubelet/pods/17c999f7-aab6-48d2-afe8-2c317c1825f5/volumes" Mar 16 00:13:28 crc kubenswrapper[4983]: I0316 00:13:28.435538 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq"] Mar 16 00:13:28 crc kubenswrapper[4983]: W0316 00:13:28.440249 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9278fbf1_bc49_4361_b1c8_4b63798e5fc7.slice/crio-51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195 WatchSource:0}: Error finding container 51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195: Status 404 returned error can't find the container with id 
51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195 Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.077564 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" event={"ID":"9278fbf1-bc49-4361-b1c8-4b63798e5fc7","Type":"ContainerStarted","Data":"206cee71b0e6278aac3686b3eb2ce2742fc538494d3d0c96acd65e3c8a7c0155"} Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.077957 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" event={"ID":"9278fbf1-bc49-4361-b1c8-4b63798e5fc7","Type":"ContainerStarted","Data":"51b1a655b22ec79d30073adfceb6012a3c86c1a7d2dae88fd50f188bf2c1e195"} Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.077982 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.082573 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" Mar 16 00:13:29 crc kubenswrapper[4983]: I0316 00:13:29.095161 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-574fdb9957-brmlq" podStartSLOduration=4.095141636 podStartE2EDuration="4.095141636s" podCreationTimestamp="2026-03-16 00:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:13:29.091365377 +0000 UTC m=+417.691463827" watchObservedRunningTime="2026-03-16 00:13:29.095141636 +0000 UTC m=+417.695240066" Mar 16 00:13:41 crc kubenswrapper[4983]: I0316 00:13:41.326046 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-vr64c" Mar 16 00:13:41 crc kubenswrapper[4983]: I0316 00:13:41.400642 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:13:53 crc kubenswrapper[4983]: I0316 00:13:53.448607 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:13:53 crc kubenswrapper[4983]: I0316 00:13:53.449445 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.145536 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.149362 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.151940 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.152285 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.152688 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.153569 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.294535 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"auto-csr-approver-29560334-5n4gc\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.396289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"auto-csr-approver-29560334-5n4gc\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.421155 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"auto-csr-approver-29560334-5n4gc\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " 
pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:00 crc kubenswrapper[4983]: I0316 00:14:00.479435 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:01 crc kubenswrapper[4983]: I0316 00:14:00.940902 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:14:01 crc kubenswrapper[4983]: I0316 00:14:01.290324 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerStarted","Data":"85bcf12afe7ba41a760a19fd98d39c6447bcc6bc7f82af3fb2a49fc3e57ff35b"} Mar 16 00:14:02 crc kubenswrapper[4983]: I0316 00:14:02.295660 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerStarted","Data":"5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a"} Mar 16 00:14:03 crc kubenswrapper[4983]: I0316 00:14:03.307351 4983 generic.go:334] "Generic (PLEG): container finished" podID="272489bc-7bd4-4421-930d-150816da83b8" containerID="5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a" exitCode=0 Mar 16 00:14:03 crc kubenswrapper[4983]: I0316 00:14:03.308616 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerDied","Data":"5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a"} Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.562983 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.577583 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") pod \"272489bc-7bd4-4421-930d-150816da83b8\" (UID: \"272489bc-7bd4-4421-930d-150816da83b8\") " Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.621868 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c" (OuterVolumeSpecName: "kube-api-access-sbf4c") pod "272489bc-7bd4-4421-930d-150816da83b8" (UID: "272489bc-7bd4-4421-930d-150816da83b8"). InnerVolumeSpecName "kube-api-access-sbf4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:04 crc kubenswrapper[4983]: I0316 00:14:04.679004 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbf4c\" (UniqueName: \"kubernetes.io/projected/272489bc-7bd4-4421-930d-150816da83b8-kube-api-access-sbf4c\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.156696 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.160178 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560328-sngnj"] Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.321433 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" event={"ID":"272489bc-7bd4-4421-930d-150816da83b8","Type":"ContainerDied","Data":"85bcf12afe7ba41a760a19fd98d39c6447bcc6bc7f82af3fb2a49fc3e57ff35b"} Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.321480 4983 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="85bcf12afe7ba41a760a19fd98d39c6447bcc6bc7f82af3fb2a49fc3e57ff35b" Mar 16 00:14:05 crc kubenswrapper[4983]: I0316 00:14:05.321484 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560334-5n4gc" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.100175 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da42bf3-da76-4db7-9653-f2f08567084f" path="/var/lib/kubelet/pods/9da42bf3-da76-4db7-9653-f2f08567084f/volumes" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.435486 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" containerID="cri-o://b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" gracePeriod=30 Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.819353 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.910687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.910995 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911039 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911067 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911099 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911121 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911181 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.911235 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") pod \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\" (UID: \"0a099f86-8967-4361-bbbf-4dfa8385d2f2\") " Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.912632 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.912863 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.918165 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.918270 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.921784 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.922934 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s" (OuterVolumeSpecName: "kube-api-access-x9n5s") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "kube-api-access-x9n5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.929333 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 16 00:14:06 crc kubenswrapper[4983]: I0316 00:14:06.940429 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0a099f86-8967-4361-bbbf-4dfa8385d2f2" (UID: "0a099f86-8967-4361-bbbf-4dfa8385d2f2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.012997 4983 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013052 4983 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0a099f86-8967-4361-bbbf-4dfa8385d2f2-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013110 4983 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0a099f86-8967-4361-bbbf-4dfa8385d2f2-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013131 4983 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013150 4983 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013170 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9n5s\" (UniqueName: \"kubernetes.io/projected/0a099f86-8967-4361-bbbf-4dfa8385d2f2-kube-api-access-x9n5s\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.013189 4983 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0a099f86-8967-4361-bbbf-4dfa8385d2f2-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.175175 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.175468 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hsgsl" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" containerID="cri-o://ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.185327 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.185564 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vxnxc" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" 
containerID="cri-o://6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.193573 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.193807 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" containerID="cri-o://44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.201073 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.201539 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b68d7" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" containerID="cri-o://c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.210839 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.211070 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56c2t" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" containerID="cri-o://ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f" gracePeriod=30 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.231363 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pvjtd"] Mar 16 00:14:07 crc kubenswrapper[4983]: E0316 00:14:07.232083 
4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272489bc-7bd4-4421-930d-150816da83b8" containerName="oc" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232109 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="272489bc-7bd4-4421-930d-150816da83b8" containerName="oc" Mar 16 00:14:07 crc kubenswrapper[4983]: E0316 00:14:07.232142 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232151 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232599 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="272489bc-7bd4-4421-930d-150816da83b8" containerName="oc" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.232631 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerName="registry" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.233243 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.256443 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pvjtd"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.320689 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.320795 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwllv\" (UniqueName: \"kubernetes.io/projected/46c9f8c6-7d08-47e7-866d-7f359e8683be-kube-api-access-wwllv\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.320836 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.343823 4983 generic.go:334] "Generic (PLEG): container finished" podID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerID="ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.343907 4983 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.353058 4983 generic.go:334] "Generic (PLEG): container finished" podID="87a722ee-1078-41fd-bd5e-96981b43652d" containerID="44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.353149 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerDied","Data":"44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355377 4983 generic.go:334] "Generic (PLEG): container finished" podID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355468 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerDied","Data":"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355486 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355502 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4sm6x" event={"ID":"0a099f86-8967-4361-bbbf-4dfa8385d2f2","Type":"ContainerDied","Data":"e0fb578aeb69cdf828d396d7abaed36aa77a72628836d8b9f23c76675c3ee11f"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.355526 4983 scope.go:117] "RemoveContainer" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.364472 4983 generic.go:334] "Generic (PLEG): container finished" podID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerID="6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.364543 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.369797 4983 generic.go:334] "Generic (PLEG): container finished" podID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerID="c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d" exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.369860 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.375564 4983 generic.go:334] "Generic (PLEG): container finished" podID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerID="ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f" 
exitCode=0 Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.375612 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f"} Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.422049 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.422132 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.422173 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwllv\" (UniqueName: \"kubernetes.io/projected/46c9f8c6-7d08-47e7-866d-7f359e8683be-kube-api-access-wwllv\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.423264 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.428378 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/46c9f8c6-7d08-47e7-866d-7f359e8683be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.438323 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwllv\" (UniqueName: \"kubernetes.io/projected/46c9f8c6-7d08-47e7-866d-7f359e8683be-kube-api-access-wwllv\") pod \"marketplace-operator-79b997595-pvjtd\" (UID: \"46c9f8c6-7d08-47e7-866d-7f359e8683be\") " pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.671638 4983 scope.go:117] "RemoveContainer" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" Mar 16 00:14:07 crc kubenswrapper[4983]: E0316 00:14:07.673189 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae\": container with ID starting with b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae not found: ID does not exist" containerID="b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.673245 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae"} err="failed to get container status \"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae\": rpc error: code = NotFound desc = could not find container 
\"b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae\": container with ID starting with b6ba7239344584339d54dcd3f9c448834389096877ed722764cd88318255abae not found: ID does not exist" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.676261 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.691818 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.710086 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.713418 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4sm6x"] Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.728602 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") pod \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.728686 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") pod \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\" (UID: \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.728791 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") pod \"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\" (UID: 
\"f617dbbc-f757-49b9-b8c6-7d0c07cb197e\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.730052 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities" (OuterVolumeSpecName: "utilities") pod "f617dbbc-f757-49b9-b8c6-7d0c07cb197e" (UID: "f617dbbc-f757-49b9-b8c6-7d0c07cb197e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.730650 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.734036 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2" (OuterVolumeSpecName: "kube-api-access-xbbl2") pod "f617dbbc-f757-49b9-b8c6-7d0c07cb197e" (UID: "f617dbbc-f757-49b9-b8c6-7d0c07cb197e"). InnerVolumeSpecName "kube-api-access-xbbl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.788108 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.797359 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.825369 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.826809 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831278 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") pod \"87a722ee-1078-41fd-bd5e-96981b43652d\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831330 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") pod \"8fd3d4ca-4839-4327-8121-fe6ba21051da\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831396 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") pod \"87a722ee-1078-41fd-bd5e-96981b43652d\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831448 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") pod \"87a722ee-1078-41fd-bd5e-96981b43652d\" (UID: \"87a722ee-1078-41fd-bd5e-96981b43652d\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831499 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") pod \"8fd3d4ca-4839-4327-8121-fe6ba21051da\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831536 4983 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") pod \"8fd3d4ca-4839-4327-8121-fe6ba21051da\" (UID: \"8fd3d4ca-4839-4327-8121-fe6ba21051da\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.831792 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbbl2\" (UniqueName: \"kubernetes.io/projected/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-kube-api-access-xbbl2\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.833343 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities" (OuterVolumeSpecName: "utilities") pod "8fd3d4ca-4839-4327-8121-fe6ba21051da" (UID: "8fd3d4ca-4839-4327-8121-fe6ba21051da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.834827 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "87a722ee-1078-41fd-bd5e-96981b43652d" (UID: "87a722ee-1078-41fd-bd5e-96981b43652d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.839300 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f" (OuterVolumeSpecName: "kube-api-access-l8w9f") pod "87a722ee-1078-41fd-bd5e-96981b43652d" (UID: "87a722ee-1078-41fd-bd5e-96981b43652d"). InnerVolumeSpecName "kube-api-access-l8w9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.841401 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "87a722ee-1078-41fd-bd5e-96981b43652d" (UID: "87a722ee-1078-41fd-bd5e-96981b43652d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.843386 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s" (OuterVolumeSpecName: "kube-api-access-cm28s") pod "8fd3d4ca-4839-4327-8121-fe6ba21051da" (UID: "8fd3d4ca-4839-4327-8121-fe6ba21051da"). InnerVolumeSpecName "kube-api-access-cm28s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.863069 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f617dbbc-f757-49b9-b8c6-7d0c07cb197e" (UID: "f617dbbc-f757-49b9-b8c6-7d0c07cb197e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.890638 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fd3d4ca-4839-4327-8121-fe6ba21051da" (UID: "8fd3d4ca-4839-4327-8121-fe6ba21051da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.935496 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") pod \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.935873 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") pod \"cbebf69d-773f-4829-a4ec-e443d52ef275\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.935987 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kmd9\" (UniqueName: \"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") pod \"cbebf69d-773f-4829-a4ec-e443d52ef275\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936070 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") pod \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936107 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") pod \"cbebf69d-773f-4829-a4ec-e443d52ef275\" (UID: \"cbebf69d-773f-4829-a4ec-e443d52ef275\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936127 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") pod \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\" (UID: \"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07\") " Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.936860 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities" (OuterVolumeSpecName: "utilities") pod "cbebf69d-773f-4829-a4ec-e443d52ef275" (UID: "cbebf69d-773f-4829-a4ec-e443d52ef275"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937193 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f617dbbc-f757-49b9-b8c6-7d0c07cb197e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937206 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937217 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937227 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937260 4983 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87a722ee-1078-41fd-bd5e-96981b43652d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 16 
00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937270 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8w9f\" (UniqueName: \"kubernetes.io/projected/87a722ee-1078-41fd-bd5e-96981b43652d-kube-api-access-l8w9f\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937279 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm28s\" (UniqueName: \"kubernetes.io/projected/8fd3d4ca-4839-4327-8121-fe6ba21051da-kube-api-access-cm28s\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937288 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fd3d4ca-4839-4327-8121-fe6ba21051da-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.937312 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities" (OuterVolumeSpecName: "utilities") pod "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" (UID: "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.940282 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49" (OuterVolumeSpecName: "kube-api-access-msk49") pod "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" (UID: "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07"). InnerVolumeSpecName "kube-api-access-msk49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.940454 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9" (OuterVolumeSpecName: "kube-api-access-6kmd9") pod "cbebf69d-773f-4829-a4ec-e443d52ef275" (UID: "cbebf69d-773f-4829-a4ec-e443d52ef275"). InnerVolumeSpecName "kube-api-access-6kmd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:14:07 crc kubenswrapper[4983]: I0316 00:14:07.965745 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbebf69d-773f-4829-a4ec-e443d52ef275" (UID: "cbebf69d-773f-4829-a4ec-e443d52ef275"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038307 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msk49\" (UniqueName: \"kubernetes.io/projected/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-kube-api-access-msk49\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038371 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038391 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbebf69d-773f-4829-a4ec-e443d52ef275-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.038408 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kmd9\" (UniqueName: 
\"kubernetes.io/projected/cbebf69d-773f-4829-a4ec-e443d52ef275-kube-api-access-6kmd9\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.054483 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" (UID: "8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.100223 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a099f86-8967-4361-bbbf-4dfa8385d2f2" path="/var/lib/kubelet/pods/0a099f86-8967-4361-bbbf-4dfa8385d2f2/volumes" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.139888 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.161255 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-pvjtd"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.382984 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56c2t" event={"ID":"8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07","Type":"ContainerDied","Data":"839c30c9cbe107a7c9f0dd7cc6175826e37c3a950a4d5a9be034e934974f0bc3"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.383036 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56c2t" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.383042 4983 scope.go:117] "RemoveContainer" containerID="ebb5761c41d710f62351daf37faae0c364dec6494e019b8e9b1984f3f34d560f" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.386137 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hsgsl" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.385940 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hsgsl" event={"ID":"8fd3d4ca-4839-4327-8121-fe6ba21051da","Type":"ContainerDied","Data":"de21ac29d1b3f85746eecc6275790d886e43e62e160f35ab6e888afb27d08a5c"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.387602 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" event={"ID":"87a722ee-1078-41fd-bd5e-96981b43652d","Type":"ContainerDied","Data":"1965cf54da33760615e034ca9db488c5481e59caf0aa16831ccaefaf972dbc39"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.387638 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-tj49l" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.388965 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" event={"ID":"46c9f8c6-7d08-47e7-866d-7f359e8683be","Type":"ContainerStarted","Data":"04c976cb885ffa3503044e3e837dd223eba7c9aa0c7e27dd3416f595aad5a275"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.388994 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" event={"ID":"46c9f8c6-7d08-47e7-866d-7f359e8683be","Type":"ContainerStarted","Data":"ece972a7fbeda3de85367618c930e653ffab76f99d167fdc32f7a49ed0000814"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.389574 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.391145 4983 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-pvjtd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" start-of-body= Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.391201 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" podUID="46c9f8c6-7d08-47e7-866d-7f359e8683be" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.395020 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vxnxc" 
event={"ID":"f617dbbc-f757-49b9-b8c6-7d0c07cb197e","Type":"ContainerDied","Data":"560de43e4286295f7d460584c287dcc2fb86a8274cec5a292335c9438faa954b"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.395176 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vxnxc" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.397994 4983 scope.go:117] "RemoveContainer" containerID="4fd735d9c2a8af79e35b41af9d3f84d5c4faeb3f496099e9f47662ac9f90becf" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.400339 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b68d7" event={"ID":"cbebf69d-773f-4829-a4ec-e443d52ef275","Type":"ContainerDied","Data":"5fbb0356673aa199b061055bf122df8d3c4f8bc1dc9d0dbf904e99d7873ede45"} Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.400380 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b68d7" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.409275 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.413991 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56c2t"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.419524 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.423236 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hsgsl"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.430064 4983 scope.go:117] "RemoveContainer" containerID="0601a98e47222baf45860438cfc29d0447fa64cf46cd7bead9a6ef97f07beb9c" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.434049 4983 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.439575 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-tj49l"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.444666 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.448081 4983 scope.go:117] "RemoveContainer" containerID="ac7cc066be48efae22a5403ae6443b38e54eef1955947eb62836ee914ccddbe0" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.451547 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b68d7"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.457990 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" podStartSLOduration=1.457972547 podStartE2EDuration="1.457972547s" podCreationTimestamp="2026-03-16 00:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:14:08.45550802 +0000 UTC m=+457.055606450" watchObservedRunningTime="2026-03-16 00:14:08.457972547 +0000 UTC m=+457.058070987" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.470241 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.474377 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vxnxc"] Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.476859 4983 scope.go:117] "RemoveContainer" containerID="de0cee5fa65ae8acc06500ed4f7bfd1b7fc45fe51327cba7b49afb9439e0134f" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.492909 4983 scope.go:117] 
"RemoveContainer" containerID="2c8a01779fdf7320586832f975808a3323314fc1dee647ee11f25e6ca498d9a4" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.505285 4983 scope.go:117] "RemoveContainer" containerID="44c3726454774541747806cc8af6b91f789705e9d629f767cefb9962a9601f36" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.517427 4983 scope.go:117] "RemoveContainer" containerID="6c4af783e4992498667061cde045dc4adfbc3a2bb0d78f02971a5d9ab47a3c4e" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.530418 4983 scope.go:117] "RemoveContainer" containerID="210bd7f5ab48e451b18cd186b0e612a0157714bee428a4d39d25cdd92c0f3eb0" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.543495 4983 scope.go:117] "RemoveContainer" containerID="1fc80a9e4fb01c05cb775f45190ece9037ca337a03452dd8abf5a08dd242d1da" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.559971 4983 scope.go:117] "RemoveContainer" containerID="c7ebe52d3884cc61ad7b5da10a6ceb1f5607237405cd86dac48708bdae4fcb7d" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.576654 4983 scope.go:117] "RemoveContainer" containerID="b6acaa7dffa774e191a9bf342869bf819b4d039ee2bd145b14e03704f80e4abc" Mar 16 00:14:08 crc kubenswrapper[4983]: I0316 00:14:08.597845 4983 scope.go:117] "RemoveContainer" containerID="b832baa9ad863d92bef0f4bd68918c75a656cd7a0c7e14efd5e15110ac3d6de8" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.192913 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193381 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193394 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193406 4983 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193412 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193421 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193427 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193440 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193445 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193452 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193458 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193466 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193473 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193482 4983 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193487 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193494 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193500 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193507 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193513 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193521 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193526 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193533 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193539 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="extract-utilities" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193546 4983 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193551 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="extract-content" Mar 16 00:14:09 crc kubenswrapper[4983]: E0316 00:14:09.193560 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193567 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193660 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193681 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193691 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193701 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" containerName="marketplace-operator" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.193709 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" containerName="registry-server" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.194557 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.198461 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.218082 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.255315 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.255380 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.255398 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.356782 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"redhat-marketplace-hgt5w\" (UID: 
\"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.356841 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.356913 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.357454 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.357523 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.373969 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"redhat-marketplace-hgt5w\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " 
pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.412330 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-pvjtd" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.511249 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.785972 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rxmlr"] Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.786844 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.794326 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.800796 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rxmlr"] Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.863947 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-utilities\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.863987 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-catalog-content\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 
16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.864021 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxvd\" (UniqueName: \"kubernetes.io/projected/b4e15b89-9659-49da-bccb-c826ebceeb93-kube-api-access-9cxvd\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.938928 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:14:09 crc kubenswrapper[4983]: W0316 00:14:09.946818 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cf6a4e_082d_473f_8640_b1eb9b6591d2.slice/crio-3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541 WatchSource:0}: Error finding container 3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541: Status 404 returned error can't find the container with id 3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541 Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.965658 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxvd\" (UniqueName: \"kubernetes.io/projected/b4e15b89-9659-49da-bccb-c826ebceeb93-kube-api-access-9cxvd\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.965775 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-utilities\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.965791 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-catalog-content\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.966290 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-catalog-content\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.966777 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4e15b89-9659-49da-bccb-c826ebceeb93-utilities\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:09 crc kubenswrapper[4983]: I0316 00:14:09.983057 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxvd\" (UniqueName: \"kubernetes.io/projected/b4e15b89-9659-49da-bccb-c826ebceeb93-kube-api-access-9cxvd\") pod \"certified-operators-rxmlr\" (UID: \"b4e15b89-9659-49da-bccb-c826ebceeb93\") " pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.101104 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a722ee-1078-41fd-bd5e-96981b43652d" path="/var/lib/kubelet/pods/87a722ee-1078-41fd-bd5e-96981b43652d/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.102421 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07" path="/var/lib/kubelet/pods/8c698a1c-1aa2-4fd5-8afb-7e9742eb9a07/volumes" Mar 
16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.103523 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd3d4ca-4839-4327-8121-fe6ba21051da" path="/var/lib/kubelet/pods/8fd3d4ca-4839-4327-8121-fe6ba21051da/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.105075 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbebf69d-773f-4829-a4ec-e443d52ef275" path="/var/lib/kubelet/pods/cbebf69d-773f-4829-a4ec-e443d52ef275/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.105942 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f617dbbc-f757-49b9-b8c6-7d0c07cb197e" path="/var/lib/kubelet/pods/f617dbbc-f757-49b9-b8c6-7d0c07cb197e/volumes" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.117359 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.415657 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" exitCode=0 Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.415782 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50"} Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.415828 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerStarted","Data":"3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541"} Mar 16 00:14:10 crc kubenswrapper[4983]: I0316 00:14:10.487385 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-rxmlr"] Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.421962 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4e15b89-9659-49da-bccb-c826ebceeb93" containerID="1ed184c41fbbe0cfdd26246e20d2e13e062bc90755594cdea61691aeae60a359" exitCode=0 Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.422061 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerDied","Data":"1ed184c41fbbe0cfdd26246e20d2e13e062bc90755594cdea61691aeae60a359"} Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.422336 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerStarted","Data":"59ad535c85447fad3a3389620dc6bbddc089f8ff0f21ad7bbae0c5487b26da3b"} Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.424686 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" exitCode=0 Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.424742 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d"} Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.590629 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hjdk"] Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.593434 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.595995 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.596337 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hjdk"] Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.686946 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-catalog-content\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.687004 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdrnf\" (UniqueName: \"kubernetes.io/projected/628d0b6e-5772-4af2-aa28-28cc15bd5d60-kube-api-access-kdrnf\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.687225 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-utilities\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.788599 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-utilities\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " 
pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.788689 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-catalog-content\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.788719 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdrnf\" (UniqueName: \"kubernetes.io/projected/628d0b6e-5772-4af2-aa28-28cc15bd5d60-kube-api-access-kdrnf\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.789077 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-utilities\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.789157 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/628d0b6e-5772-4af2-aa28-28cc15bd5d60-catalog-content\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.807622 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdrnf\" (UniqueName: \"kubernetes.io/projected/628d0b6e-5772-4af2-aa28-28cc15bd5d60-kube-api-access-kdrnf\") pod \"redhat-operators-8hjdk\" (UID: \"628d0b6e-5772-4af2-aa28-28cc15bd5d60\") " pod="openshift-marketplace/redhat-operators-8hjdk" Mar 
16 00:14:11 crc kubenswrapper[4983]: I0316 00:14:11.912855 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.202698 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-95rsh"] Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.204162 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.207654 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.209357 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95rsh"] Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.307893 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-utilities\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.307943 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-catalog-content\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.308094 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxbg\" (UniqueName: 
\"kubernetes.io/projected/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-kube-api-access-6vxbg\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.356049 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hjdk"] Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.409455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxbg\" (UniqueName: \"kubernetes.io/projected/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-kube-api-access-6vxbg\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.409510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-utilities\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.409545 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-catalog-content\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.410040 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-catalog-content\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc 
kubenswrapper[4983]: I0316 00:14:12.410093 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-utilities\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.426862 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxbg\" (UniqueName: \"kubernetes.io/projected/4deeaa90-9b0b-47cb-a8bf-4b2524a736a8-kube-api-access-6vxbg\") pod \"community-operators-95rsh\" (UID: \"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8\") " pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.433983 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4e15b89-9659-49da-bccb-c826ebceeb93" containerID="5760f0b4afd0b8ec7ad7ef8450f92118348dc093bd56a33aab992dad1ec1b8b1" exitCode=0 Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.434044 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerDied","Data":"5760f0b4afd0b8ec7ad7ef8450f92118348dc093bd56a33aab992dad1ec1b8b1"} Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.438099 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerStarted","Data":"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010"} Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.439224 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerStarted","Data":"952ad432709f7e1f20c8ed834fb41075eb679c955619d8842a55ef23f4eb92d8"} Mar 16 00:14:12 crc 
kubenswrapper[4983]: I0316 00:14:12.519700 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.909912 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hgt5w" podStartSLOduration=2.446534504 podStartE2EDuration="3.909897179s" podCreationTimestamp="2026-03-16 00:14:09 +0000 UTC" firstStartedPulling="2026-03-16 00:14:10.417021644 +0000 UTC m=+459.017120074" lastFinishedPulling="2026-03-16 00:14:11.880384319 +0000 UTC m=+460.480482749" observedRunningTime="2026-03-16 00:14:12.475160412 +0000 UTC m=+461.075258862" watchObservedRunningTime="2026-03-16 00:14:12.909897179 +0000 UTC m=+461.509995609" Mar 16 00:14:12 crc kubenswrapper[4983]: I0316 00:14:12.912065 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-95rsh"] Mar 16 00:14:12 crc kubenswrapper[4983]: W0316 00:14:12.916911 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4deeaa90_9b0b_47cb_a8bf_4b2524a736a8.slice/crio-86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840 WatchSource:0}: Error finding container 86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840: Status 404 returned error can't find the container with id 86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840 Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.446011 4983 generic.go:334] "Generic (PLEG): container finished" podID="628d0b6e-5772-4af2-aa28-28cc15bd5d60" containerID="03812425a0066a0cb8753010a44bc79a1afe1757459aafa6771375ce0923d821" exitCode=0 Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.446107 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" 
event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerDied","Data":"03812425a0066a0cb8753010a44bc79a1afe1757459aafa6771375ce0923d821"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.451639 4983 generic.go:334] "Generic (PLEG): container finished" podID="4deeaa90-9b0b-47cb-a8bf-4b2524a736a8" containerID="103a145016d5f85202efc125ca9860d61c1a070c8c81a59e7c305c82b79be272" exitCode=0 Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.451709 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerDied","Data":"103a145016d5f85202efc125ca9860d61c1a070c8c81a59e7c305c82b79be272"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.451731 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerStarted","Data":"86dd4e8c5f987a5e21ab707ebc1543aabcbdb1e0c21e81b09fab6bfb600f6840"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.460916 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxmlr" event={"ID":"b4e15b89-9659-49da-bccb-c826ebceeb93","Type":"ContainerStarted","Data":"819e219f8fa52b7d9a4b13b5f4a608060200d7a98f5caf2d3bc3fc96d9268e66"} Mar 16 00:14:13 crc kubenswrapper[4983]: I0316 00:14:13.530893 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rxmlr" podStartSLOduration=3.087410551 podStartE2EDuration="4.530875975s" podCreationTimestamp="2026-03-16 00:14:09 +0000 UTC" firstStartedPulling="2026-03-16 00:14:11.423768807 +0000 UTC m=+460.023867237" lastFinishedPulling="2026-03-16 00:14:12.867234231 +0000 UTC m=+461.467332661" observedRunningTime="2026-03-16 00:14:13.496046639 +0000 UTC m=+462.096145089" watchObservedRunningTime="2026-03-16 00:14:13.530875975 +0000 UTC 
m=+462.130974405" Mar 16 00:14:14 crc kubenswrapper[4983]: I0316 00:14:14.467312 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerStarted","Data":"b826077cc335e020e32973b7b2699a884131448fddff91106fcd824b02c99b9b"} Mar 16 00:14:14 crc kubenswrapper[4983]: I0316 00:14:14.468971 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerStarted","Data":"c368f85fd8ac6934d02456de71a6a8584c04ea955f47fac632bb3564121c63b3"} Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.478083 4983 generic.go:334] "Generic (PLEG): container finished" podID="628d0b6e-5772-4af2-aa28-28cc15bd5d60" containerID="b826077cc335e020e32973b7b2699a884131448fddff91106fcd824b02c99b9b" exitCode=0 Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.478157 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerDied","Data":"b826077cc335e020e32973b7b2699a884131448fddff91106fcd824b02c99b9b"} Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.480212 4983 generic.go:334] "Generic (PLEG): container finished" podID="4deeaa90-9b0b-47cb-a8bf-4b2524a736a8" containerID="c368f85fd8ac6934d02456de71a6a8584c04ea955f47fac632bb3564121c63b3" exitCode=0 Mar 16 00:14:15 crc kubenswrapper[4983]: I0316 00:14:15.480237 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerDied","Data":"c368f85fd8ac6934d02456de71a6a8584c04ea955f47fac632bb3564121c63b3"} Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.487613 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-95rsh" 
event={"ID":"4deeaa90-9b0b-47cb-a8bf-4b2524a736a8","Type":"ContainerStarted","Data":"b8f171ee59fb652ce06fb93a8fbdcf904f55f167d019cfec9066291fccdf630d"} Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.489608 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hjdk" event={"ID":"628d0b6e-5772-4af2-aa28-28cc15bd5d60","Type":"ContainerStarted","Data":"7e889058e0047e4b7eba362a74c3c0fc59c3cc9c26af5e15dc1668da9a757f57"} Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.506795 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-95rsh" podStartSLOduration=2.081783237 podStartE2EDuration="4.50677788s" podCreationTimestamp="2026-03-16 00:14:12 +0000 UTC" firstStartedPulling="2026-03-16 00:14:13.465007216 +0000 UTC m=+462.065105656" lastFinishedPulling="2026-03-16 00:14:15.890001869 +0000 UTC m=+464.490100299" observedRunningTime="2026-03-16 00:14:16.506387509 +0000 UTC m=+465.106485949" watchObservedRunningTime="2026-03-16 00:14:16.50677788 +0000 UTC m=+465.106876310" Mar 16 00:14:16 crc kubenswrapper[4983]: I0316 00:14:16.527627 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hjdk" podStartSLOduration=3.095277733 podStartE2EDuration="5.527611085s" podCreationTimestamp="2026-03-16 00:14:11 +0000 UTC" firstStartedPulling="2026-03-16 00:14:13.448616781 +0000 UTC m=+462.048715211" lastFinishedPulling="2026-03-16 00:14:15.880950133 +0000 UTC m=+464.481048563" observedRunningTime="2026-03-16 00:14:16.526875175 +0000 UTC m=+465.126973605" watchObservedRunningTime="2026-03-16 00:14:16.527611085 +0000 UTC m=+465.127709515" Mar 16 00:14:19 crc kubenswrapper[4983]: I0316 00:14:19.511845 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:19 crc kubenswrapper[4983]: I0316 00:14:19.527782 4983 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:19 crc kubenswrapper[4983]: I0316 00:14:19.573491 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.118233 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.118320 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.158799 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.548140 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rxmlr" Mar 16 00:14:20 crc kubenswrapper[4983]: I0316 00:14:20.560392 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:14:21 crc kubenswrapper[4983]: I0316 00:14:21.913086 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:21 crc kubenswrapper[4983]: I0316 00:14:21.913148 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.520231 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.520273 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.566129 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:22 crc kubenswrapper[4983]: I0316 00:14:22.957866 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8hjdk" podUID="628d0b6e-5772-4af2-aa28-28cc15bd5d60" containerName="registry-server" probeResult="failure" output=< Mar 16 00:14:22 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s Mar 16 00:14:22 crc kubenswrapper[4983]: > Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.448040 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449198 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449296 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449742 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.449826 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c" gracePeriod=600 Mar 16 00:14:23 crc kubenswrapper[4983]: I0316 00:14:23.579811 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-95rsh" Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542432 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c" exitCode=0 Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542519 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c"} Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542710 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f"} Mar 16 00:14:24 crc kubenswrapper[4983]: I0316 00:14:24.542738 4983 scope.go:117] "RemoveContainer" containerID="25383bb27840763e8b89264f9c06cefbfc745b728eb234fac9fd4119115a2383" Mar 16 00:14:31 crc kubenswrapper[4983]: I0316 00:14:31.977148 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:14:32 crc kubenswrapper[4983]: I0316 
00:14:32.041028 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hjdk" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.140180 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb"] Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.143270 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.146194 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.146504 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.151218 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb"] Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.299474 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.299937 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.300015 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.401028 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.401075 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.401113 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.402565 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.408487 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.417741 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"collect-profiles-29560335-q8csb\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.465564 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:00 crc kubenswrapper[4983]: I0316 00:15:00.885976 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb"] Mar 16 00:15:01 crc kubenswrapper[4983]: I0316 00:15:01.763435 4983 generic.go:334] "Generic (PLEG): container finished" podID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerID="def262889d3b7fe5b27fe4d3deb37511089ea35b2bea84f3d8f7a004f334c93b" exitCode=0 Mar 16 00:15:01 crc kubenswrapper[4983]: I0316 00:15:01.763824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" event={"ID":"1ff90261-e4e9-4ff3-86a2-6a0274e9454e","Type":"ContainerDied","Data":"def262889d3b7fe5b27fe4d3deb37511089ea35b2bea84f3d8f7a004f334c93b"} Mar 16 00:15:01 crc kubenswrapper[4983]: I0316 00:15:01.763865 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" event={"ID":"1ff90261-e4e9-4ff3-86a2-6a0274e9454e","Type":"ContainerStarted","Data":"32ccd45dbf3d3c04feb7c91ef200fb408fbc1cead4c28f3dd2087b6a592da66f"} Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.032855 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.132697 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") pod \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.132805 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") pod \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.132849 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") pod \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\" (UID: \"1ff90261-e4e9-4ff3-86a2-6a0274e9454e\") " Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.133326 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ff90261-e4e9-4ff3-86a2-6a0274e9454e" (UID: "1ff90261-e4e9-4ff3-86a2-6a0274e9454e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.138868 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ff90261-e4e9-4ff3-86a2-6a0274e9454e" (UID: "1ff90261-e4e9-4ff3-86a2-6a0274e9454e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.140065 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85" (OuterVolumeSpecName: "kube-api-access-58p85") pod "1ff90261-e4e9-4ff3-86a2-6a0274e9454e" (UID: "1ff90261-e4e9-4ff3-86a2-6a0274e9454e"). InnerVolumeSpecName "kube-api-access-58p85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.234430 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58p85\" (UniqueName: \"kubernetes.io/projected/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-kube-api-access-58p85\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.234471 4983 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.234490 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ff90261-e4e9-4ff3-86a2-6a0274e9454e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.775538 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" event={"ID":"1ff90261-e4e9-4ff3-86a2-6a0274e9454e","Type":"ContainerDied","Data":"32ccd45dbf3d3c04feb7c91ef200fb408fbc1cead4c28f3dd2087b6a592da66f"} Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.775575 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560335-q8csb" Mar 16 00:15:03 crc kubenswrapper[4983]: I0316 00:15:03.775579 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ccd45dbf3d3c04feb7c91ef200fb408fbc1cead4c28f3dd2087b6a592da66f" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.151042 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:16:00 crc kubenswrapper[4983]: E0316 00:16:00.152025 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerName="collect-profiles" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.152047 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerName="collect-profiles" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.152252 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff90261-e4e9-4ff3-86a2-6a0274e9454e" containerName="collect-profiles" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.152867 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.155460 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.158149 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.158274 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.159880 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.275152 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"auto-csr-approver-29560336-6d4qf\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.375924 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"auto-csr-approver-29560336-6d4qf\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.400607 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"auto-csr-approver-29560336-6d4qf\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " 
pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.514384 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.714896 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:16:00 crc kubenswrapper[4983]: I0316 00:16:00.723493 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:16:01 crc kubenswrapper[4983]: I0316 00:16:01.157499 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerStarted","Data":"62abfec3868b503dddcadd449b49bba04126f6906e655f7ab31e9614fffc7705"} Mar 16 00:16:02 crc kubenswrapper[4983]: I0316 00:16:02.168201 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerStarted","Data":"a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19"} Mar 16 00:16:02 crc kubenswrapper[4983]: I0316 00:16:02.189438 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" podStartSLOduration=1.166087173 podStartE2EDuration="2.189416839s" podCreationTimestamp="2026-03-16 00:16:00 +0000 UTC" firstStartedPulling="2026-03-16 00:16:00.723258025 +0000 UTC m=+569.323356455" lastFinishedPulling="2026-03-16 00:16:01.746587691 +0000 UTC m=+570.346686121" observedRunningTime="2026-03-16 00:16:02.189235764 +0000 UTC m=+570.789334194" watchObservedRunningTime="2026-03-16 00:16:02.189416839 +0000 UTC m=+570.789515269" Mar 16 00:16:03 crc kubenswrapper[4983]: I0316 00:16:03.174447 4983 generic.go:334] "Generic (PLEG): container finished" 
podID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerID="a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19" exitCode=0 Mar 16 00:16:03 crc kubenswrapper[4983]: I0316 00:16:03.174497 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerDied","Data":"a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19"} Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.384469 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.425729 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") pod \"b56bb064-30c4-4aaf-a4d2-c81006425b62\" (UID: \"b56bb064-30c4-4aaf-a4d2-c81006425b62\") " Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.429827 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc" (OuterVolumeSpecName: "kube-api-access-vrrqc") pod "b56bb064-30c4-4aaf-a4d2-c81006425b62" (UID: "b56bb064-30c4-4aaf-a4d2-c81006425b62"). InnerVolumeSpecName "kube-api-access-vrrqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:16:04 crc kubenswrapper[4983]: I0316 00:16:04.527242 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrrqc\" (UniqueName: \"kubernetes.io/projected/b56bb064-30c4-4aaf-a4d2-c81006425b62-kube-api-access-vrrqc\") on node \"crc\" DevicePath \"\"" Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.179064 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"] Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.185526 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560330-65dr5"] Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.187142 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" event={"ID":"b56bb064-30c4-4aaf-a4d2-c81006425b62","Type":"ContainerDied","Data":"62abfec3868b503dddcadd449b49bba04126f6906e655f7ab31e9614fffc7705"} Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.187177 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62abfec3868b503dddcadd449b49bba04126f6906e655f7ab31e9614fffc7705" Mar 16 00:16:05 crc kubenswrapper[4983]: I0316 00:16:05.187185 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560336-6d4qf" Mar 16 00:16:06 crc kubenswrapper[4983]: I0316 00:16:06.101564 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39b8480-5521-4ff7-b6ec-4f67009b1f5c" path="/var/lib/kubelet/pods/c39b8480-5521-4ff7-b6ec-4f67009b1f5c/volumes" Mar 16 00:16:23 crc kubenswrapper[4983]: I0316 00:16:23.448701 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:16:23 crc kubenswrapper[4983]: I0316 00:16:23.449279 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:16:53 crc kubenswrapper[4983]: I0316 00:16:53.448526 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:16:53 crc kubenswrapper[4983]: I0316 00:16:53.449112 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.447901 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.448391 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.448431 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.448942 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.448990 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f" gracePeriod=600 Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.680674 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f" exitCode=0 Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.680736 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f"} Mar 16 00:17:23 crc kubenswrapper[4983]: I0316 00:17:23.680956 4983 scope.go:117] "RemoveContainer" containerID="a285b65caa99c1c0ba0c4deb9dc06267b724d77153088ef275808d94e8acc41c" Mar 16 00:17:24 crc kubenswrapper[4983]: I0316 00:17:24.692160 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18"} Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.137787 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:18:00 crc kubenswrapper[4983]: E0316 00:18:00.138713 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.138737 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.138956 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" containerName="oc" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.139556 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.142135 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.143045 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.143952 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.144098 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.271451 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"auto-csr-approver-29560338-2jkpl\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.372471 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"auto-csr-approver-29560338-2jkpl\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.403968 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"auto-csr-approver-29560338-2jkpl\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " 
pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.468842 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.690153 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:18:00 crc kubenswrapper[4983]: I0316 00:18:00.911828 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" event={"ID":"1c6e333f-fadd-4c92-8db1-b9a923850fa0","Type":"ContainerStarted","Data":"a7cd1304998263bded23ae2ca04dc03222cab0d730e80dcbefaa82e2d971d2df"} Mar 16 00:18:02 crc kubenswrapper[4983]: I0316 00:18:02.930442 4983 generic.go:334] "Generic (PLEG): container finished" podID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerID="e40b8ff2ea2fe096fb51ca5ef76f5eab03f687249bde3326f40974dcfd1c4938" exitCode=0 Mar 16 00:18:02 crc kubenswrapper[4983]: I0316 00:18:02.930532 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" event={"ID":"1c6e333f-fadd-4c92-8db1-b9a923850fa0","Type":"ContainerDied","Data":"e40b8ff2ea2fe096fb51ca5ef76f5eab03f687249bde3326f40974dcfd1c4938"} Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.116589 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.227786 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") pod \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\" (UID: \"1c6e333f-fadd-4c92-8db1-b9a923850fa0\") " Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.240321 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld" (OuterVolumeSpecName: "kube-api-access-jdwld") pod "1c6e333f-fadd-4c92-8db1-b9a923850fa0" (UID: "1c6e333f-fadd-4c92-8db1-b9a923850fa0"). InnerVolumeSpecName "kube-api-access-jdwld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.329656 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdwld\" (UniqueName: \"kubernetes.io/projected/1c6e333f-fadd-4c92-8db1-b9a923850fa0-kube-api-access-jdwld\") on node \"crc\" DevicePath \"\"" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.942688 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" event={"ID":"1c6e333f-fadd-4c92-8db1-b9a923850fa0","Type":"ContainerDied","Data":"a7cd1304998263bded23ae2ca04dc03222cab0d730e80dcbefaa82e2d971d2df"} Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.942744 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7cd1304998263bded23ae2ca04dc03222cab0d730e80dcbefaa82e2d971d2df" Mar 16 00:18:04 crc kubenswrapper[4983]: I0316 00:18:04.942879 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560338-2jkpl" Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.175001 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"] Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.182531 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560332-pflh5"] Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.501458 4983 scope.go:117] "RemoveContainer" containerID="76d2b798a64d4809150e865ba49cceb6346042cb22c2796d78469f6cd57fde6c" Mar 16 00:18:05 crc kubenswrapper[4983]: I0316 00:18:05.530970 4983 scope.go:117] "RemoveContainer" containerID="f1d9cd29662f3f229511dac637df41ff7b782921910c342dbfa3015d6466b383" Mar 16 00:18:06 crc kubenswrapper[4983]: I0316 00:18:06.104802 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d9cc10-08aa-485e-a7cd-305a3e316c39" path="/var/lib/kubelet/pods/90d9cc10-08aa-485e-a7cd-305a3e316c39/volumes" Mar 16 00:19:05 crc kubenswrapper[4983]: I0316 00:19:05.824364 4983 scope.go:117] "RemoveContainer" containerID="0e3f6e1e6221d6bd922f567a1feb21e97e8062170d3d8a1f33f38076de2dd3b8" Mar 16 00:19:05 crc kubenswrapper[4983]: I0316 00:19:05.903158 4983 scope.go:117] "RemoveContainer" containerID="81ae8785e149353406399189ed21a1fda919c310e54aefe671726be28185c2ae" Mar 16 00:19:23 crc kubenswrapper[4983]: I0316 00:19:23.447948 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:19:23 crc kubenswrapper[4983]: I0316 00:19:23.450061 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.442866 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wsfb4"] Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444220 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" containerID="cri-o://903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444308 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444308 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" containerID="cri-o://cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444600 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" containerID="cri-o://a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444618 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" 
podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" containerID="cri-o://15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.444390 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" containerID="cri-o://f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.445111 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" containerID="cri-o://f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.482040 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.482040 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.486145 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.488033 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.488111 4983 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.489940 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.492936 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" containerID="cri-o://7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" gracePeriod=30 Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.493030 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.493117 4983 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.730593 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.732696 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-acl-logging/0.log" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.733171 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-controller/0.log" Mar 16 00:19:33 crc 
kubenswrapper[4983]: I0316 00:19:33.733573 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785646 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b29wv"] Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785889 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785909 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785924 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785932 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785943 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785951 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785962 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerName="oc" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785970 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerName="oc" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785982 4983 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.785990 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.785999 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786006 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786015 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786021 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786033 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786040 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786051 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kubecfg-setup" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786058 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kubecfg-setup" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786070 4983 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786078 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786090 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786098 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786109 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786116 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786223 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786235 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786247 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786255 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-ovn-metrics" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786265 4983 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="northd" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786274 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="nbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786283 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="sbdb" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786291 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786299 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="kube-rbac-proxy-node" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786315 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovn-acl-logging" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786324 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" containerName="oc" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786434 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786444 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: E0316 00:19:33.786455 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786463 4983 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786573 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.786586 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerName="ovnkube-controller" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.789413 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.884923 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.884970 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.884994 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885013 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885037 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885062 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885088 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885110 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885116 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). 
InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885141 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885181 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885186 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885217 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket" (OuterVolumeSpecName: "log-socket") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885216 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log" (OuterVolumeSpecName: "node-log") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885247 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885237 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885289 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod 
"f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885313 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885227 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885266 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885337 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885362 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885375 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885376 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885385 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885405 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash" (OuterVolumeSpecName: "host-slash") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885452 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885489 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885633 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885681 4983 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885697 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885714 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") pod \"f055dad5-7c9b-46a1-a715-34847c30d0cf\" (UID: \"f055dad5-7c9b-46a1-a715-34847c30d0cf\") " Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885876 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.885993 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886007 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886158 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-var-lib-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886186 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-kubelet\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886203 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-systemd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886250 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9hd\" (UniqueName: 
\"kubernetes.io/projected/728b696b-af39-40e1-9f49-eb3f9ab1f87d-kube-api-access-nm9hd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886268 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-systemd-units\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886285 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886296 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886349 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-log-socket\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886366 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-env-overrides\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886391 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886413 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-bin\") pod 
\"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886438 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-node-log\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886476 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-script-lib\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886551 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-etc-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886589 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-netns\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886718 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-ovn\") pod 
\"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886829 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-netd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886933 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-slash\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.886984 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovn-node-metrics-cert\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887082 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-config\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887133 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887277 4983 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887307 4983 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887327 4983 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-node-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887346 4983 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887363 4983 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887379 4983 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887397 4983 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-log-socket\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887415 4983 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887437 4983 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887459 4983 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887477 4983 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887492 4983 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887509 4983 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-slash\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887525 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887541 4983 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887561 4983 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f055dad5-7c9b-46a1-a715-34847c30d0cf-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.887577 4983 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.891241 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.891339 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k" (OuterVolumeSpecName: "kube-api-access-88s5k") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "kube-api-access-88s5k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.898262 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f055dad5-7c9b-46a1-a715-34847c30d0cf" (UID: "f055dad5-7c9b-46a1-a715-34847c30d0cf"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988587 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-etc-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988641 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-netns\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988675 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-ovn\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988701 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-netd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988730 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-slash\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988767 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovn-node-metrics-cert\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988795 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-config\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988815 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988851 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-var-lib-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988867 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-ovn\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988877 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-kubelet\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988867 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-slash\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988899 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-systemd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988908 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-var-lib-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988930 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-systemd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988939 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9hd\" (UniqueName: \"kubernetes.io/projected/728b696b-af39-40e1-9f49-eb3f9ab1f87d-kube-api-access-nm9hd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988813 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-etc-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989001 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-netd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989010 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-systemd-units\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988898 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988830 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-run-netns\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989077 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-systemd-units\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989079 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989175 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-log-socket\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.988960 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-kubelet\") pod 
\"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989113 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-run-openvswitch\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989194 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-env-overrides\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989251 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-log-socket\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989362 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989392 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-bin\") pod \"ovnkube-node-b29wv\" (UID: 
\"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989451 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989455 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-host-cni-bin\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989494 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-node-log\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989523 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/728b696b-af39-40e1-9f49-eb3f9ab1f87d-node-log\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989546 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-script-lib\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989943 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-env-overrides\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990278 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-script-lib\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.989637 4983 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f055dad5-7c9b-46a1-a715-34847c30d0cf-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990334 4983 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f055dad5-7c9b-46a1-a715-34847c30d0cf-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990347 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88s5k\" (UniqueName: \"kubernetes.io/projected/f055dad5-7c9b-46a1-a715-34847c30d0cf-kube-api-access-88s5k\") on node \"crc\" DevicePath \"\"" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.990368 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovnkube-config\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:33 crc kubenswrapper[4983]: I0316 00:19:33.993706 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/728b696b-af39-40e1-9f49-eb3f9ab1f87d-ovn-node-metrics-cert\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.009462 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9hd\" (UniqueName: \"kubernetes.io/projected/728b696b-af39-40e1-9f49-eb3f9ab1f87d-kube-api-access-nm9hd\") pod \"ovnkube-node-b29wv\" (UID: \"728b696b-af39-40e1-9f49-eb3f9ab1f87d\") " pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.102468 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.490740 4983 generic.go:334] "Generic (PLEG): container finished" podID="728b696b-af39-40e1-9f49-eb3f9ab1f87d" containerID="23bf7874921506f9febc4cf6cbd0f358df99a2c8a12ee98a60f0365637a382da" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.490817 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerDied","Data":"23bf7874921506f9febc4cf6cbd0f358df99a2c8a12ee98a60f0365637a382da"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.491109 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"a1c8c5159e9520ea4c6fa20ab275e7ae7cb25edb60436880e7ad5d9a31900897"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.493227 4983 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovnkube-controller/3.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.495581 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-acl-logging/0.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496098 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wsfb4_f055dad5-7c9b-46a1-a715-34847c30d0cf/ovn-controller/0.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496554 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496588 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496607 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496621 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496625 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" 
event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496631 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496640 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496650 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" exitCode=0 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496654 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496640 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496703 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496719 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" 
event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496658 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" exitCode=143 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496731 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496741 4983 generic.go:334] "Generic (PLEG): container finished" podID="f055dad5-7c9b-46a1-a715-34847c30d0cf" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" exitCode=143 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496743 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496790 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496667 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496798 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496923 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496940 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496948 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496960 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496967 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496974 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.496994 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" 
event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497018 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497028 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497035 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497041 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497047 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497053 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497060 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497067 4983 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497074 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497082 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497092 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497103 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497112 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497120 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497127 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: 
I0316 00:19:34.497134 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497140 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497147 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497153 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497160 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497167 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497177 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wsfb4" event={"ID":"f055dad5-7c9b-46a1-a715-34847c30d0cf","Type":"ContainerDied","Data":"a0c448e461e6c3fe1b265793bab80821f3f4c31a789d62361e918982254116d6"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497189 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497198 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497205 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497212 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497218 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497225 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497232 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497238 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497245 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.497253 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498421 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/2.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498781 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498812 4983 generic.go:334] "Generic (PLEG): container finished" podID="f81ec143-6c51-4f96-ae71-a4759bac7c70" containerID="1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da" exitCode=2 Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498835 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerDied","Data":"1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.498851 4983 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9"} Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.499157 4983 scope.go:117] "RemoveContainer" containerID="1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.499334 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-multus pod=multus-tqncp_openshift-multus(f81ec143-6c51-4f96-ae71-a4759bac7c70)\"" pod="openshift-multus/multus-tqncp" podUID="f81ec143-6c51-4f96-ae71-a4759bac7c70" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.524653 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.540387 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wsfb4"] Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.544072 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wsfb4"] Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.563061 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.607813 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.621710 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.636092 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.649375 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.661348 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.710562 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 
00:19:34.726519 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.742257 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.742852 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.742897 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.742929 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.743350 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" 
containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.743404 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.743442 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.744047 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744405 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744422 4983 scope.go:117] 
"RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.744835 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744875 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.744896 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.745490 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745520 4983 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745537 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.745836 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745863 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.745879 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.746176 4983 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746207 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746222 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.746625 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746654 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container 
\"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.746676 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.747089 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747127 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747146 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: E0316 00:19:34.747509 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" 
containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747532 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747546 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747949 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.747967 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748243 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could 
not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748257 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748503 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.748521 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.749472 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.749492 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 
00:19:34.749911 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.749936 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750271 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750292 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750546 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with 
cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750565 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750805 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.750828 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751088 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751108 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751420 4983 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751438 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751669 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.751686 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752190 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not 
exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752213 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752660 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752681 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.752915 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753113 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753380 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status 
\"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753400 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753646 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753666 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753935 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.753986 4983 scope.go:117] "RemoveContainer" 
containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754384 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754407 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754695 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.754897 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755257 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could 
not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755277 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755612 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.755643 4983 scope.go:117] "RemoveContainer" containerID="a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.756108 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca"} err="failed to get container status \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": rpc error: code = NotFound desc = could not find container \"a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca\": container with ID starting with a143d57716d6156882cc9536e0e12acfc0cf7ad3d1c2d44400234058ce0b86ca not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.756127 4983 scope.go:117] "RemoveContainer" containerID="15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 
00:19:34.756578 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765"} err="failed to get container status \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": rpc error: code = NotFound desc = could not find container \"15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765\": container with ID starting with 15fefd039d698f41ec4f0d4f0b4c304c4ef364c4396c19918439812abc517765 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.756606 4983 scope.go:117] "RemoveContainer" containerID="f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757072 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad"} err="failed to get container status \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": rpc error: code = NotFound desc = could not find container \"f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad\": container with ID starting with f4c655a6757ce4844239499464548ba66682c42388c993c5a1489d0575f0f5ad not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757096 4983 scope.go:117] "RemoveContainer" containerID="a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757327 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1"} err="failed to get container status \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": rpc error: code = NotFound desc = could not find container \"a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1\": container with ID starting with 
a196dc724d27f295114b4ab16ffa82752923ba31d227313dcf639434814082d1 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757345 4983 scope.go:117] "RemoveContainer" containerID="5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757568 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c"} err="failed to get container status \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": rpc error: code = NotFound desc = could not find container \"5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c\": container with ID starting with 5353b271f722be9717421b0d8eb5e96fec61b5b487fc66e0dd7befba40e9b29c not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757590 4983 scope.go:117] "RemoveContainer" containerID="cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757842 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61"} err="failed to get container status \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": rpc error: code = NotFound desc = could not find container \"cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61\": container with ID starting with cf4c253b74270f4710a22708c64fddb53e5e289549ce04e18041468ae6ebff61 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.757868 4983 scope.go:117] "RemoveContainer" containerID="f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758084 4983 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab"} err="failed to get container status \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": rpc error: code = NotFound desc = could not find container \"f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab\": container with ID starting with f6d36d8b0d8acf6564f46afa45b8bd596a39f417a1c978de4a28a0011757bcab not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758103 4983 scope.go:117] "RemoveContainer" containerID="903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758307 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144"} err="failed to get container status \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": rpc error: code = NotFound desc = could not find container \"903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144\": container with ID starting with 903992a58edfe30aa0baf72b5c8c83125b568dacb0570d0d52391543af924144 not found: ID does not exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758324 4983 scope.go:117] "RemoveContainer" containerID="294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758595 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a"} err="failed to get container status \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": rpc error: code = NotFound desc = could not find container \"294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a\": container with ID starting with 294a90ddac12f7fd4ccfae17c61b181229cca27b8aafe71ea8dbc575c376e90a not found: ID does not 
exist" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758625 4983 scope.go:117] "RemoveContainer" containerID="7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8" Mar 16 00:19:34 crc kubenswrapper[4983]: I0316 00:19:34.758920 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8"} err="failed to get container status \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": rpc error: code = NotFound desc = could not find container \"7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8\": container with ID starting with 7b03d99c24b52cc308851b5f2ea4c2a9948ddb0a17706fb0d349d80cbf1c1dc8 not found: ID does not exist" Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514076 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"cafac4abaa5ea78029d0f81189400f2c7e33e0a3af5cf98c710f7c4b17f2726e"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514142 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"85f6ac2df96f9517579723f4b30522cd4829b387dce48dfe326c701bf2c145b8"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514239 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"aafbb1172845e63d001271f1bc0b1c8aa3f051bdc2764bf44a65e63399919e17"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514307 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" 
event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"0d3ab15df6e440889eed1141da42ab71266ef7b836c68ee8422d06493316f458"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514334 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"89005a22df996694d53a88a98c1e3d6e8aec458923c8e8e2bfa6b91a1a70bd39"} Mar 16 00:19:35 crc kubenswrapper[4983]: I0316 00:19:35.514353 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"281e191e6c680a52386c56c840df4b1e717fe00d47ab0983bb24eb4c2092330c"} Mar 16 00:19:36 crc kubenswrapper[4983]: I0316 00:19:36.105805 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f055dad5-7c9b-46a1-a715-34847c30d0cf" path="/var/lib/kubelet/pods/f055dad5-7c9b-46a1-a715-34847c30d0cf/volumes" Mar 16 00:19:37 crc kubenswrapper[4983]: I0316 00:19:37.531369 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"c58a78bdbc722c72145803dc2cbd3b9ef82cbca5dbce9a3d2b91a594548c0d95"} Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.559153 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" event={"ID":"728b696b-af39-40e1-9f49-eb3f9ab1f87d","Type":"ContainerStarted","Data":"8c353e4739fa5108c73a78b99d1acc8c55f379b1c1bfab213c85b7962a208d16"} Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.561707 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.561726 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.589617 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:40 crc kubenswrapper[4983]: I0316 00:19:40.599481 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" podStartSLOduration=7.59944868 podStartE2EDuration="7.59944868s" podCreationTimestamp="2026-03-16 00:19:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:19:40.588706219 +0000 UTC m=+789.188804689" watchObservedRunningTime="2026-03-16 00:19:40.59944868 +0000 UTC m=+789.199547150" Mar 16 00:19:41 crc kubenswrapper[4983]: I0316 00:19:41.566806 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:41 crc kubenswrapper[4983]: I0316 00:19:41.630416 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.092704 4983 scope.go:117] "RemoveContainer" containerID="1ce990601ab37c57875d72edbed61342c2686f343314ce5c6375afee78bda6da" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.607524 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/2.log" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.608160 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/1.log" Mar 16 00:19:47 crc kubenswrapper[4983]: I0316 00:19:47.608239 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tqncp" 
event={"ID":"f81ec143-6c51-4f96-ae71-a4759bac7c70","Type":"ContainerStarted","Data":"d802bbc6780abcefaa729b7a4287774d5b04d7688b5a98ef4e274499eb75f8ea"} Mar 16 00:19:53 crc kubenswrapper[4983]: I0316 00:19:53.448938 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:19:53 crc kubenswrapper[4983]: I0316 00:19:53.449553 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.220150 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.221306 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.221407 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.226872 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.227031 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.226877 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.327378 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"auto-csr-approver-29560340-664mq\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.429345 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"auto-csr-approver-29560340-664mq\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.454352 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"auto-csr-approver-29560340-664mq\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.540535 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:00 crc kubenswrapper[4983]: I0316 00:20:00.754990 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:20:00 crc kubenswrapper[4983]: W0316 00:20:00.766090 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3356aa9a_4f16_4602_97b0_1118f7e55776.slice/crio-fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720 WatchSource:0}: Error finding container fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720: Status 404 returned error can't find the container with id fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720 Mar 16 00:20:01 crc kubenswrapper[4983]: I0316 00:20:01.709323 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-664mq" event={"ID":"3356aa9a-4f16-4602-97b0-1118f7e55776","Type":"ContainerStarted","Data":"fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720"} Mar 16 00:20:03 crc kubenswrapper[4983]: I0316 00:20:03.721350 4983 generic.go:334] "Generic (PLEG): container finished" podID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerID="ec962f764e58dc18fb35bd2bf73250ec727cbdfcfdec0a585462238f6e2032c9" exitCode=0 Mar 16 00:20:03 crc kubenswrapper[4983]: I0316 00:20:03.721388 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-664mq" event={"ID":"3356aa9a-4f16-4602-97b0-1118f7e55776","Type":"ContainerDied","Data":"ec962f764e58dc18fb35bd2bf73250ec727cbdfcfdec0a585462238f6e2032c9"} Mar 16 00:20:04 crc kubenswrapper[4983]: I0316 00:20:04.123311 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b29wv" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.022323 4983 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.091155 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") pod \"3356aa9a-4f16-4602-97b0-1118f7e55776\" (UID: \"3356aa9a-4f16-4602-97b0-1118f7e55776\") " Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.098643 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq" (OuterVolumeSpecName: "kube-api-access-79clq") pod "3356aa9a-4f16-4602-97b0-1118f7e55776" (UID: "3356aa9a-4f16-4602-97b0-1118f7e55776"). InnerVolumeSpecName "kube-api-access-79clq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.192210 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79clq\" (UniqueName: \"kubernetes.io/projected/3356aa9a-4f16-4602-97b0-1118f7e55776-kube-api-access-79clq\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.737259 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560340-664mq" event={"ID":"3356aa9a-4f16-4602-97b0-1118f7e55776","Type":"ContainerDied","Data":"fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720"} Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.737588 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbcaec4b97f3b18cc867c2775ac9da5a85a4983b8d3fd788b4a601a1a6734720" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.737467 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560340-664mq" Mar 16 00:20:05 crc kubenswrapper[4983]: I0316 00:20:05.946707 4983 scope.go:117] "RemoveContainer" containerID="dad7e1310ab2887413c42ef74324ffee2aab3ebb28bbeaf086bef3c87b2585f9" Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.069256 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.072406 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560334-5n4gc"] Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.100188 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272489bc-7bd4-4421-930d-150816da83b8" path="/var/lib/kubelet/pods/272489bc-7bd4-4421-930d-150816da83b8/volumes" Mar 16 00:20:06 crc kubenswrapper[4983]: I0316 00:20:06.744829 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-tqncp_f81ec143-6c51-4f96-ae71-a4759bac7c70/kube-multus/2.log" Mar 16 00:20:10 crc kubenswrapper[4983]: I0316 00:20:10.969248 4983 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.432105 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.434170 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hgt5w" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" containerID="cri-o://f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" gracePeriod=30 Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.512353 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: container process not found" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.512836 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: container process not found" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.513266 4983 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: container process not found" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" cmd=["grpc_health_probe","-addr=:50051"] Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.513309 4983 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-hgt5w" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.776712 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819528 4983 generic.go:334] "Generic (PLEG): container finished" podID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" exitCode=0 Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819575 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010"} Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819605 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgt5w" event={"ID":"b4cf6a4e-082d-473f-8640-b1eb9b6591d2","Type":"ContainerDied","Data":"3bdf27aed1d212bcce22928b3d69b78bec030ff348adbec53a55366b6c0d3541"} Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819623 4983 scope.go:117] "RemoveContainer" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.819741 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgt5w" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.845066 4983 scope.go:117] "RemoveContainer" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.863025 4983 scope.go:117] "RemoveContainer" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.880966 4983 scope.go:117] "RemoveContainer" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.881464 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010\": container with ID starting with f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 not found: ID does not exist" containerID="f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.881498 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010"} err="failed to get container status \"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010\": rpc error: code = NotFound desc = could not find container \"f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010\": container with ID starting with f4eef968135be918dddaf10347d050c0dac2e61ff709cb2b7e4ba2bae08da010 not found: ID does not exist" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.881518 4983 scope.go:117] "RemoveContainer" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.882340 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d\": container with ID starting with 361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d not found: ID does not exist" containerID="361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.882396 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d"} err="failed to get container status \"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d\": rpc error: code = NotFound desc = could not find container \"361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d\": container with ID starting with 361c9cc645851d74a2f40a94df85900a29655e5a062aff83ac40abe89dda580d not found: ID does not exist" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.882432 4983 scope.go:117] "RemoveContainer" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" Mar 16 00:20:19 crc kubenswrapper[4983]: E0316 00:20:19.882837 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50\": container with ID starting with 0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50 not found: ID does not exist" containerID="0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.882874 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50"} err="failed to get container status \"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50\": rpc error: code = NotFound desc = could not find container 
\"0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50\": container with ID starting with 0b0af7632540b25514bfab7bc05635e0ac2dbc338d9fb477ff08a99ba06f3f50 not found: ID does not exist" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.956983 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") pod \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.957066 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") pod \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.957105 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") pod \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\" (UID: \"b4cf6a4e-082d-473f-8640-b1eb9b6591d2\") " Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.958247 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities" (OuterVolumeSpecName: "utilities") pod "b4cf6a4e-082d-473f-8640-b1eb9b6591d2" (UID: "b4cf6a4e-082d-473f-8640-b1eb9b6591d2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.961768 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8" (OuterVolumeSpecName: "kube-api-access-2snn8") pod "b4cf6a4e-082d-473f-8640-b1eb9b6591d2" (UID: "b4cf6a4e-082d-473f-8640-b1eb9b6591d2"). InnerVolumeSpecName "kube-api-access-2snn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:19 crc kubenswrapper[4983]: I0316 00:20:19.989267 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4cf6a4e-082d-473f-8640-b1eb9b6591d2" (UID: "b4cf6a4e-082d-473f-8640-b1eb9b6591d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.058989 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.059055 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2snn8\" (UniqueName: \"kubernetes.io/projected/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-kube-api-access-2snn8\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.059087 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4cf6a4e-082d-473f-8640-b1eb9b6591d2-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 00:20:20.158246 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:20:20 crc kubenswrapper[4983]: I0316 
00:20:20.163802 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgt5w"] Mar 16 00:20:22 crc kubenswrapper[4983]: I0316 00:20:22.100934 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" path="/var/lib/kubelet/pods/b4cf6a4e-082d-473f-8640-b1eb9b6591d2/volumes" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331252 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7"] Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331443 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331454 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331468 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-content" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331474 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-content" Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331489 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerName="oc" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331495 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerName="oc" Mar 16 00:20:23 crc kubenswrapper[4983]: E0316 00:20:23.331504 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-utilities" Mar 16 00:20:23 crc kubenswrapper[4983]: 
I0316 00:20:23.331510 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="extract-utilities" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331601 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4cf6a4e-082d-473f-8640-b1eb9b6591d2" containerName="registry-server" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.331611 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" containerName="oc" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.332255 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.334428 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.341361 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7"] Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.414642 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.414708 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.414959 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.447824 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.447887 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.447935 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.448528 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.448596 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18" gracePeriod=600 Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.515519 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.515584 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.515613 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.516196 4983 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.516534 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.539713 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.660144 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846196 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18" exitCode=0 Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846282 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18"} Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846554 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5"} Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.846577 4983 scope.go:117] "RemoveContainer" containerID="e056a1a5ca459a72f9b9d946a266c3a0b85c436266062d48d109316238bb9f2f" Mar 16 00:20:23 crc kubenswrapper[4983]: I0316 00:20:23.853637 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7"] Mar 16 00:20:23 crc kubenswrapper[4983]: W0316 00:20:23.864226 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e5d5e8_e64e_4876_a604_976485b93449.slice/crio-be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58 WatchSource:0}: Error finding container be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58: Status 404 returned error can't find the container with id be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58 Mar 16 00:20:24 crc 
kubenswrapper[4983]: I0316 00:20:24.855187 4983 generic.go:334] "Generic (PLEG): container finished" podID="d4e5d5e8-e64e-4876-a604-976485b93449" containerID="4038284b7308d1921454212a0595d9066728744e6bdac74a76e70712f46efdca" exitCode=0 Mar 16 00:20:24 crc kubenswrapper[4983]: I0316 00:20:24.855298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"4038284b7308d1921454212a0595d9066728744e6bdac74a76e70712f46efdca"} Mar 16 00:20:24 crc kubenswrapper[4983]: I0316 00:20:24.855534 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerStarted","Data":"be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58"} Mar 16 00:20:25 crc kubenswrapper[4983]: I0316 00:20:25.873473 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerStarted","Data":"634138fc4f69897b591d293e0d6fada5b8c0f16866e672765d9c538b04bfc7af"} Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.494162 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.495464 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.510030 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.554422 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.554737 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.554864 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.655446 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.655504 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.655543 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.656305 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.656418 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.673869 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"redhat-operators-246kv\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.863278 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.879103 4983 generic.go:334] "Generic (PLEG): container finished" podID="d4e5d5e8-e64e-4876-a604-976485b93449" containerID="634138fc4f69897b591d293e0d6fada5b8c0f16866e672765d9c538b04bfc7af" exitCode=0 Mar 16 00:20:26 crc kubenswrapper[4983]: I0316 00:20:26.879337 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"634138fc4f69897b591d293e0d6fada5b8c0f16866e672765d9c538b04bfc7af"} Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.072350 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:27 crc kubenswrapper[4983]: W0316 00:20:27.087979 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fefe96_0a7d_4f0c_ad4c_9ddb1573f5eb.slice/crio-189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a WatchSource:0}: Error finding container 189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a: Status 404 returned error can't find the container with id 189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.885027 4983 generic.go:334] "Generic (PLEG): container finished" podID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerID="cf4b26abd512b4e809f852ba2adfc32b6dc9094135d78aae3f567c7db9c58b4d" exitCode=0 Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.885149 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"cf4b26abd512b4e809f852ba2adfc32b6dc9094135d78aae3f567c7db9c58b4d"} Mar 16 00:20:27 crc 
kubenswrapper[4983]: I0316 00:20:27.885451 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerStarted","Data":"189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a"} Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.888525 4983 generic.go:334] "Generic (PLEG): container finished" podID="d4e5d5e8-e64e-4876-a604-976485b93449" containerID="e46cd813755a717f51c264df7d3f3ee959849a5fecdfac5897bf8ae16155d088" exitCode=0 Mar 16 00:20:27 crc kubenswrapper[4983]: I0316 00:20:27.888559 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"e46cd813755a717f51c264df7d3f3ee959849a5fecdfac5897bf8ae16155d088"} Mar 16 00:20:28 crc kubenswrapper[4983]: I0316 00:20:28.896351 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerStarted","Data":"7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c"} Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.167205 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.285385 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") pod \"d4e5d5e8-e64e-4876-a604-976485b93449\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.285486 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") pod \"d4e5d5e8-e64e-4876-a604-976485b93449\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.285532 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") pod \"d4e5d5e8-e64e-4876-a604-976485b93449\" (UID: \"d4e5d5e8-e64e-4876-a604-976485b93449\") " Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.287967 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle" (OuterVolumeSpecName: "bundle") pod "d4e5d5e8-e64e-4876-a604-976485b93449" (UID: "d4e5d5e8-e64e-4876-a604-976485b93449"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.295992 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7" (OuterVolumeSpecName: "kube-api-access-tsbd7") pod "d4e5d5e8-e64e-4876-a604-976485b93449" (UID: "d4e5d5e8-e64e-4876-a604-976485b93449"). InnerVolumeSpecName "kube-api-access-tsbd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.309540 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util" (OuterVolumeSpecName: "util") pod "d4e5d5e8-e64e-4876-a604-976485b93449" (UID: "d4e5d5e8-e64e-4876-a604-976485b93449"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.386880 4983 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.386935 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d4e5d5e8-e64e-4876-a604-976485b93449-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.386946 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsbd7\" (UniqueName: \"kubernetes.io/projected/d4e5d5e8-e64e-4876-a604-976485b93449-kube-api-access-tsbd7\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.904947 4983 generic.go:334] "Generic (PLEG): container finished" podID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerID="7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c" exitCode=0 Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.905058 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c"} Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.910902 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" event={"ID":"d4e5d5e8-e64e-4876-a604-976485b93449","Type":"ContainerDied","Data":"be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58"} Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.910939 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be943c076adacabef341397ce15419ddde21b92a1cc39e897f1732c8c8e99f58" Mar 16 00:20:29 crc kubenswrapper[4983]: I0316 00:20:29.910970 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115436 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4"] Mar 16 00:20:30 crc kubenswrapper[4983]: E0316 00:20:30.115623 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="extract" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115635 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="extract" Mar 16 00:20:30 crc kubenswrapper[4983]: E0316 00:20:30.115655 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="util" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115661 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="util" Mar 16 00:20:30 crc kubenswrapper[4983]: E0316 00:20:30.115671 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="pull" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115677 4983 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="pull" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.115792 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4e5d5e8-e64e-4876-a604-976485b93449" containerName="extract" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.116462 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.120336 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.146486 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4"] Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.298554 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.298704 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.298740 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400031 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400094 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400179 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.400911 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod 
\"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.401032 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.423081 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.469435 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.652175 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4"] Mar 16 00:20:30 crc kubenswrapper[4983]: W0316 00:20:30.660895 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48256dd4_332f_4a25_a535_4357e3b8eccb.slice/crio-e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f WatchSource:0}: Error finding container e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f: Status 404 returned error can't find the container with id e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.923798 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x"] Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.925405 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.930163 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x"] Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.935476 4983 generic.go:334] "Generic (PLEG): container finished" podID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerID="37eb925f51cce8332a0d3416e69ecaa63d5963330725cae68d19889ffc78eb0d" exitCode=0 Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.935565 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"37eb925f51cce8332a0d3416e69ecaa63d5963330725cae68d19889ffc78eb0d"} Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.935599 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerStarted","Data":"e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f"} Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.940460 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerStarted","Data":"3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7"} Mar 16 00:20:30 crc kubenswrapper[4983]: I0316 00:20:30.971353 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-246kv" podStartSLOduration=2.538387461 podStartE2EDuration="4.971335593s" podCreationTimestamp="2026-03-16 00:20:26 +0000 UTC" firstStartedPulling="2026-03-16 00:20:27.88713968 +0000 UTC 
m=+836.487238110" lastFinishedPulling="2026-03-16 00:20:30.320087822 +0000 UTC m=+838.920186242" observedRunningTime="2026-03-16 00:20:30.967474302 +0000 UTC m=+839.567572732" watchObservedRunningTime="2026-03-16 00:20:30.971335593 +0000 UTC m=+839.571434023" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.008150 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.008283 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.008349 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.109587 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod 
\"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.109864 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.109994 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.110631 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.111014 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.132184 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.254067 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.464864 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x"] Mar 16 00:20:31 crc kubenswrapper[4983]: W0316 00:20:31.469602 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8092d7d9_1bb8_44ce_bad9_4f36ba75b349.slice/crio-45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00 WatchSource:0}: Error finding container 45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00: Status 404 returned error can't find the container with id 45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00 Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.946169 4983 generic.go:334] "Generic (PLEG): container finished" podID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerID="b2eedd04857c1bfe63cf7a147ece277e81ccf9d3fd9d5f5dedd1e310bf405781" exitCode=0 Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.946267 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"b2eedd04857c1bfe63cf7a147ece277e81ccf9d3fd9d5f5dedd1e310bf405781"} Mar 16 00:20:31 crc kubenswrapper[4983]: I0316 00:20:31.946503 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerStarted","Data":"45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00"} Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.952913 4983 generic.go:334] "Generic (PLEG): container finished" podID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerID="c4fb851d36c9925cb4f5a450a8ba4c9f9f89ad6c5b9a79d22a1b0e42f3820a1c" exitCode=0 Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.953025 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"c4fb851d36c9925cb4f5a450a8ba4c9f9f89ad6c5b9a79d22a1b0e42f3820a1c"} Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.954581 4983 generic.go:334] "Generic (PLEG): container finished" podID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerID="5da96aafe7ab4e3aff6e9a3e1302bf9ac0b47550440914c2e8ad72b3f672453e" exitCode=0 Mar 16 00:20:32 crc kubenswrapper[4983]: I0316 00:20:32.954606 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"5da96aafe7ab4e3aff6e9a3e1302bf9ac0b47550440914c2e8ad72b3f672453e"} Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.962040 4983 generic.go:334] "Generic (PLEG): container finished" podID="48256dd4-332f-4a25-a535-4357e3b8eccb" 
containerID="87ae9077a4806cbec3b568db675d0545fefc2a16e8da481d47eb9d7b5ee01c52" exitCode=0 Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.962117 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"87ae9077a4806cbec3b568db675d0545fefc2a16e8da481d47eb9d7b5ee01c52"} Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.964490 4983 generic.go:334] "Generic (PLEG): container finished" podID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerID="6eedef20352e4aad3a2355f26948e4452aa05f0087f89db2be1b74ea509461c1" exitCode=0 Mar 16 00:20:33 crc kubenswrapper[4983]: I0316 00:20:33.964515 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"6eedef20352e4aad3a2355f26948e4452aa05f0087f89db2be1b74ea509461c1"} Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.410862 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.473549 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576384 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") pod \"48256dd4-332f-4a25-a535-4357e3b8eccb\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576437 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") pod \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576460 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") pod \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576494 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") pod \"48256dd4-332f-4a25-a535-4357e3b8eccb\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576513 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") pod \"48256dd4-332f-4a25-a535-4357e3b8eccb\" (UID: \"48256dd4-332f-4a25-a535-4357e3b8eccb\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576548 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") pod \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\" (UID: \"8092d7d9-1bb8-44ce-bad9-4f36ba75b349\") " Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.576984 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle" (OuterVolumeSpecName: "bundle") pod "48256dd4-332f-4a25-a535-4357e3b8eccb" (UID: "48256dd4-332f-4a25-a535-4357e3b8eccb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.577336 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle" (OuterVolumeSpecName: "bundle") pod "8092d7d9-1bb8-44ce-bad9-4f36ba75b349" (UID: "8092d7d9-1bb8-44ce-bad9-4f36ba75b349"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.584959 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8" (OuterVolumeSpecName: "kube-api-access-l47j8") pod "8092d7d9-1bb8-44ce-bad9-4f36ba75b349" (UID: "8092d7d9-1bb8-44ce-bad9-4f36ba75b349"). InnerVolumeSpecName "kube-api-access-l47j8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.595937 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5" (OuterVolumeSpecName: "kube-api-access-zz9w5") pod "48256dd4-332f-4a25-a535-4357e3b8eccb" (UID: "48256dd4-332f-4a25-a535-4357e3b8eccb"). InnerVolumeSpecName "kube-api-access-zz9w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.613133 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util" (OuterVolumeSpecName: "util") pod "8092d7d9-1bb8-44ce-bad9-4f36ba75b349" (UID: "8092d7d9-1bb8-44ce-bad9-4f36ba75b349"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.613841 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util" (OuterVolumeSpecName: "util") pod "48256dd4-332f-4a25-a535-4357e3b8eccb" (UID: "48256dd4-332f-4a25-a535-4357e3b8eccb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678073 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678110 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-bundle\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678119 4983 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-util\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678128 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz9w5\" (UniqueName: \"kubernetes.io/projected/48256dd4-332f-4a25-a535-4357e3b8eccb-kube-api-access-zz9w5\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678140 4983 
reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48256dd4-332f-4a25-a535-4357e3b8eccb-util\") on node \"crc\" DevicePath \"\""
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.678150 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l47j8\" (UniqueName: \"kubernetes.io/projected/8092d7d9-1bb8-44ce-bad9-4f36ba75b349-kube-api-access-l47j8\") on node \"crc\" DevicePath \"\""
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700444 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l677b"]
Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700650 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="util"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700662 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="util"
Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700671 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="extract"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700677 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="extract"
Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700688 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="extract"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700694 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="extract"
Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700709 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="util"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700715 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="util"
Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700723 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="pull"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700728 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="pull"
Mar 16 00:20:35 crc kubenswrapper[4983]: E0316 00:20:35.700738 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="pull"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700743 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="pull"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700838 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8092d7d9-1bb8-44ce-bad9-4f36ba75b349" containerName="extract"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.700853 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="48256dd4-332f-4a25-a535-4357e3b8eccb" containerName="extract"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.701513 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.720511 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l677b"]
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.779113 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.779188 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.779212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880084 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880145 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880172 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880676 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.880721 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.904782 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"certified-operators-l677b\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.976087 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x" event={"ID":"8092d7d9-1bb8-44ce-bad9-4f36ba75b349","Type":"ContainerDied","Data":"45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00"}
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.976128 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45279d172fef6a2e038b6b71ce4fd26e82f50e6e6a99a74bb90861b6dc6e8a00"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.976131 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.978296 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4" event={"ID":"48256dd4-332f-4a25-a535-4357e3b8eccb","Type":"ContainerDied","Data":"e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f"}
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.978333 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8fefd7f948a9835f21fcb69e40f4410a62e65982dac1b17c77dbb0c68469a8f"
Mar 16 00:20:35 crc kubenswrapper[4983]: I0316 00:20:35.978398 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4"
Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.012519 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l677b"
Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.331251 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l677b"]
Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.863849 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-246kv"
Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.863898 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-246kv"
Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.984284 4983 generic.go:334] "Generic (PLEG): container finished" podID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerID="fdae44ae256e839c75602b525bcc23b96273c95335b8e9ad6fa6615a4eb894ee" exitCode=0
Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.984383 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"fdae44ae256e839c75602b525bcc23b96273c95335b8e9ad6fa6615a4eb894ee"}
Mar 16 00:20:36 crc kubenswrapper[4983]: I0316 00:20:36.984864 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerStarted","Data":"2ca390185601a7ed62fc960bc8af54221f40dfad67cd1fd3da407d672363b944"}
Mar 16 00:20:37 crc kubenswrapper[4983]: I0316 00:20:37.921145 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-246kv" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server" probeResult="failure" output=<
Mar 16 00:20:37 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s
Mar 16 00:20:37 crc kubenswrapper[4983]: >
Mar 16 00:20:37 crc kubenswrapper[4983]: I0316 00:20:37.991304 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerStarted","Data":"b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084"}
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.933043 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"]
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.933979 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.936551 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.940088 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.940156 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.940382 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.952851 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"]
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.998918 4983 generic.go:334] "Generic (PLEG): container finished" podID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerID="b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084" exitCode=0
Mar 16 00:20:38 crc kubenswrapper[4983]: I0316 00:20:38.998966 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084"}
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042016 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042283 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042372 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042640 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.042744 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.064791 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.250648 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:39 crc kubenswrapper[4983]: I0316 00:20:39.497451 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"]
Mar 16 00:20:39 crc kubenswrapper[4983]: W0316 00:20:39.511546 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd45ab45_645e_45d3_a9eb_a3d1392b5f7a.slice/crio-d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d WatchSource:0}: Error finding container d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d: Status 404 returned error can't find the container with id d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.006896 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerStarted","Data":"ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2"}
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.008624 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerStarted","Data":"5be1a7319c157e061cc029f579f1321a2ab9f725df9d59cd85cfdeb7f85614c8"}
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.008650 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerStarted","Data":"d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d"}
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.026053 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l677b" podStartSLOduration=2.557308675 podStartE2EDuration="5.026039425s" podCreationTimestamp="2026-03-16 00:20:35 +0000 UTC" firstStartedPulling="2026-03-16 00:20:36.985835305 +0000 UTC m=+845.585933735" lastFinishedPulling="2026-03-16 00:20:39.454566045 +0000 UTC m=+848.054664485" observedRunningTime="2026-03-16 00:20:40.02549264 +0000 UTC m=+848.625591070" watchObservedRunningTime="2026-03-16 00:20:40.026039425 +0000 UTC m=+848.626137845"
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.903892 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"]
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.905213 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.907193 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-qbx62"
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.907404 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.913908 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"]
Mar 16 00:20:40 crc kubenswrapper[4983]: I0316 00:20:40.915926 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.014880 4983 generic.go:334] "Generic (PLEG): container finished" podID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerID="5be1a7319c157e061cc029f579f1321a2ab9f725df9d59cd85cfdeb7f85614c8" exitCode=0
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.015641 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"5be1a7319c157e061cc029f579f1321a2ab9f725df9d59cd85cfdeb7f85614c8"}
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.021835 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.023167 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.025533 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-z2zf6"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.037047 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.037782 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.038219 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.052824 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.055082 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.077442 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj89v\" (UniqueName: \"kubernetes.io/projected/2af5ec54-bcc4-45f5-839a-135da91513a2-kube-api-access-rj89v\") pod \"obo-prometheus-operator-68bc856cb9-sn6x8\" (UID: \"2af5ec54-bcc4-45f5-839a-135da91513a2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.135892 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-c99mb"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.136570 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.138456 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.138894 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-slwg2"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.153629 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-c99mb"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.183940 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj89v\" (UniqueName: \"kubernetes.io/projected/2af5ec54-bcc4-45f5-839a-135da91513a2-kube-api-access-rj89v\") pod \"obo-prometheus-operator-68bc856cb9-sn6x8\" (UID: \"2af5ec54-bcc4-45f5-839a-135da91513a2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184019 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184109 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184128 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.184151 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.208532 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj89v\" (UniqueName: \"kubernetes.io/projected/2af5ec54-bcc4-45f5-839a-135da91513a2-kube-api-access-rj89v\") pod \"obo-prometheus-operator-68bc856cb9-sn6x8\" (UID: \"2af5ec54-bcc4-45f5-839a-135da91513a2\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.276786 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285628 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7drq\" (UniqueName: \"kubernetes.io/projected/05523d68-53d9-4cc5-a02b-5221a2396606-kube-api-access-x7drq\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285842 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285885 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285945 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.285983 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/05523d68-53d9-4cc5-a02b-5221a2396606-observability-operator-tls\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.286072 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.290573 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.291311 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.291312 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/30d188b9-ab98-47a3-8143-3f58ae611dd6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-kx26j\" (UID: \"30d188b9-ab98-47a3-8143-3f58ae611dd6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.293926 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7e065fa9-405e-452b-bfe7-c4920a8577db-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd\" (UID: \"7e065fa9-405e-452b-bfe7-c4920a8577db\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.340355 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.344241 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dmdpt"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.353236 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dmdpt"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.353353 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.355271 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.356895 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-qzjzq"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.387188 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/05523d68-53d9-4cc5-a02b-5221a2396606-observability-operator-tls\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.387262 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7drq\" (UniqueName: \"kubernetes.io/projected/05523d68-53d9-4cc5-a02b-5221a2396606-kube-api-access-x7drq\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.392667 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/05523d68-53d9-4cc5-a02b-5221a2396606-observability-operator-tls\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.417684 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7drq\" (UniqueName: \"kubernetes.io/projected/05523d68-53d9-4cc5-a02b-5221a2396606-kube-api-access-x7drq\") pod \"observability-operator-59bdc8b94-c99mb\" (UID: \"05523d68-53d9-4cc5-a02b-5221a2396606\") " pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.453673 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-c99mb"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.488470 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbpxn\" (UniqueName: \"kubernetes.io/projected/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-kube-api-access-zbpxn\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.488523 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.590258 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbpxn\" (UniqueName: \"kubernetes.io/projected/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-kube-api-access-zbpxn\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.590303 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.592655 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.623586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbpxn\" (UniqueName: \"kubernetes.io/projected/8eb6b056-16ea-46db-b8ea-fd17a717a8e4-kube-api-access-zbpxn\") pod \"perses-operator-5bf474d74f-dmdpt\" (UID: \"8eb6b056-16ea-46db-b8ea-fd17a717a8e4\") " pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.709889 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.718619 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.801110 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-c99mb"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.843671 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"]
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.852534 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8"]
Mar 16 00:20:41 crc kubenswrapper[4983]: W0316 00:20:41.855577 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e065fa9_405e_452b_bfe7_c4920a8577db.slice/crio-9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201 WatchSource:0}: Error finding container 9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201: Status 404 returned error can't find the container with id 9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201
Mar 16 00:20:41 crc kubenswrapper[4983]: I0316 00:20:41.940093 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dmdpt"]
Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.024857 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" event={"ID":"2af5ec54-bcc4-45f5-839a-135da91513a2","Type":"ContainerStarted","Data":"027cd6429254dd17ee667c31575d9e723fe4b02dbf7690e108c0f0f84d5046b5"}
Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.026345 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd"
event={"ID":"7e065fa9-405e-452b-bfe7-c4920a8577db","Type":"ContainerStarted","Data":"9277385c2fbc793cbe6eb1c23d7a7ec7cdae2a2c3bd1193732f8676c914e8201"} Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.027272 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" event={"ID":"30d188b9-ab98-47a3-8143-3f58ae611dd6","Type":"ContainerStarted","Data":"1c7aa08faa1e0c4d71702c30fee2135c0edeb9c7749dedf3a525b4d9ba16acc0"} Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.028282 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" event={"ID":"8eb6b056-16ea-46db-b8ea-fd17a717a8e4","Type":"ContainerStarted","Data":"bd56ccc15d253d16cfb3130753611c862d5e0f5dc093b529a4a32320d273acdf"} Mar 16 00:20:42 crc kubenswrapper[4983]: I0316 00:20:42.029150 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" event={"ID":"05523d68-53d9-4cc5-a02b-5221a2396606","Type":"ContainerStarted","Data":"855e71bf1bdda4a6f1e450402d524f07bf3fb40ee40240144e479978a5770db0"} Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.810941 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hdnm6"] Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.812012 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.815713 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.815968 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-2gs2c" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.819542 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.821507 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hdnm6"] Mar 16 00:20:44 crc kubenswrapper[4983]: I0316 00:20:44.946518 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsfb7\" (UniqueName: \"kubernetes.io/projected/5f1c8286-7638-43ad-bfec-fe7210fa4d73-kube-api-access-tsfb7\") pod \"interconnect-operator-5bb49f789d-hdnm6\" (UID: \"5f1c8286-7638-43ad-bfec-fe7210fa4d73\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:45 crc kubenswrapper[4983]: I0316 00:20:45.049686 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsfb7\" (UniqueName: \"kubernetes.io/projected/5f1c8286-7638-43ad-bfec-fe7210fa4d73-kube-api-access-tsfb7\") pod \"interconnect-operator-5bb49f789d-hdnm6\" (UID: \"5f1c8286-7638-43ad-bfec-fe7210fa4d73\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:45 crc kubenswrapper[4983]: I0316 00:20:45.099820 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsfb7\" (UniqueName: \"kubernetes.io/projected/5f1c8286-7638-43ad-bfec-fe7210fa4d73-kube-api-access-tsfb7\") pod 
\"interconnect-operator-5bb49f789d-hdnm6\" (UID: \"5f1c8286-7638-43ad-bfec-fe7210fa4d73\") " pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:45 crc kubenswrapper[4983]: I0316 00:20:45.141982 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.013041 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.018010 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.078934 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.209425 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.915624 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:46 crc kubenswrapper[4983]: I0316 00:20:46.966538 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.806963 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-b96d44b59-tbkm6"] Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.807619 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.809564 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.809591 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-bxd4d" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.829152 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-b96d44b59-tbkm6"] Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.889765 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-webhook-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.889819 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-apiservice-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.889863 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9sl\" (UniqueName: \"kubernetes.io/projected/7872b362-5118-4058-abba-048e0a81ecff-kube-api-access-vf9sl\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.991103 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-webhook-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.991165 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-apiservice-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.991212 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9sl\" (UniqueName: \"kubernetes.io/projected/7872b362-5118-4058-abba-048e0a81ecff-kube-api-access-vf9sl\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.997829 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-webhook-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:47 crc kubenswrapper[4983]: I0316 00:20:47.998404 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7872b362-5118-4058-abba-048e0a81ecff-apiservice-cert\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:48 crc kubenswrapper[4983]: I0316 00:20:48.015688 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9sl\" (UniqueName: \"kubernetes.io/projected/7872b362-5118-4058-abba-048e0a81ecff-kube-api-access-vf9sl\") pod \"elastic-operator-b96d44b59-tbkm6\" (UID: \"7872b362-5118-4058-abba-048e0a81ecff\") " pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:48 crc kubenswrapper[4983]: I0316 00:20:48.136766 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" Mar 16 00:20:49 crc kubenswrapper[4983]: I0316 00:20:49.885389 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:49 crc kubenswrapper[4983]: I0316 00:20:49.887285 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l677b" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server" containerID="cri-o://ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2" gracePeriod=2 Mar 16 00:20:50 crc kubenswrapper[4983]: I0316 00:20:50.168711 4983 generic.go:334] "Generic (PLEG): container finished" podID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerID="ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2" exitCode=0 Mar 16 00:20:50 crc kubenswrapper[4983]: I0316 00:20:50.168782 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2"} Mar 16 00:20:51 crc kubenswrapper[4983]: I0316 00:20:51.084357 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"] Mar 16 00:20:51 crc kubenswrapper[4983]: I0316 00:20:51.084638 4983 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-246kv" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server" containerID="cri-o://3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7" gracePeriod=2 Mar 16 00:20:52 crc kubenswrapper[4983]: I0316 00:20:52.192273 4983 generic.go:334] "Generic (PLEG): container finished" podID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerID="3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7" exitCode=0 Mar 16 00:20:52 crc kubenswrapper[4983]: I0316 00:20:52.192338 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7"} Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.253590 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.268694 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-hdnm6"] Mar 16 00:20:54 crc kubenswrapper[4983]: W0316 00:20:54.281929 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1c8286_7638_43ad_bfec_fe7210fa4d73.slice/crio-fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd WatchSource:0}: Error finding container fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd: Status 404 returned error can't find the container with id fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.288644 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390686 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") pod \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390765 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") pod \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390813 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") pod \"e93de2c7-8794-463c-9a2d-ac74246f35b7\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390869 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") pod \"e93de2c7-8794-463c-9a2d-ac74246f35b7\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390907 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") pod \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\" (UID: \"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.390922 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") pod \"e93de2c7-8794-463c-9a2d-ac74246f35b7\" (UID: \"e93de2c7-8794-463c-9a2d-ac74246f35b7\") " Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.391472 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities" (OuterVolumeSpecName: "utilities") pod "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" (UID: "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.391595 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities" (OuterVolumeSpecName: "utilities") pod "e93de2c7-8794-463c-9a2d-ac74246f35b7" (UID: "e93de2c7-8794-463c-9a2d-ac74246f35b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.404991 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs" (OuterVolumeSpecName: "kube-api-access-ch4cs") pod "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" (UID: "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb"). InnerVolumeSpecName "kube-api-access-ch4cs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.405947 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst" (OuterVolumeSpecName: "kube-api-access-xhpst") pod "e93de2c7-8794-463c-9a2d-ac74246f35b7" (UID: "e93de2c7-8794-463c-9a2d-ac74246f35b7"). InnerVolumeSpecName "kube-api-access-xhpst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.423172 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-b96d44b59-tbkm6"] Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.452354 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e93de2c7-8794-463c-9a2d-ac74246f35b7" (UID: "e93de2c7-8794-463c-9a2d-ac74246f35b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492641 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492677 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e93de2c7-8794-463c-9a2d-ac74246f35b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492689 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch4cs\" (UniqueName: \"kubernetes.io/projected/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-kube-api-access-ch4cs\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492701 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhpst\" (UniqueName: \"kubernetes.io/projected/e93de2c7-8794-463c-9a2d-ac74246f35b7-kube-api-access-xhpst\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.492712 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-utilities\") on 
node \"crc\" DevicePath \"\"" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.525465 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" (UID: "a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:20:54 crc kubenswrapper[4983]: I0316 00:20:54.593817 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.220480 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l677b" event={"ID":"e93de2c7-8794-463c-9a2d-ac74246f35b7","Type":"ContainerDied","Data":"2ca390185601a7ed62fc960bc8af54221f40dfad67cd1fd3da407d672363b944"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.220814 4983 scope.go:117] "RemoveContainer" containerID="ca70d91557ffd10bcdac32d57f09d3183f7bd43b81d72ecd5d63691b5e1903c2" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.220545 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l677b" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.229660 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" event={"ID":"5f1c8286-7638-43ad-bfec-fe7210fa4d73","Type":"ContainerStarted","Data":"fb954ad6b6c151e90996a0456f533d0f17575abf2506669743979dcbb92d5afd"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.235061 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" event={"ID":"2af5ec54-bcc4-45f5-839a-135da91513a2","Type":"ContainerStarted","Data":"16f3a31594ed9eb1ebd9e260b07bcd275a1a8abfe9c17eedc8718739037a5e19"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.239741 4983 generic.go:334] "Generic (PLEG): container finished" podID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerID="5e7d0dbf615618baf160d51d6d79b4dfa32050f849de2fc0ecd095a7b4d3e2ba" exitCode=0 Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.239813 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"5e7d0dbf615618baf160d51d6d79b4dfa32050f849de2fc0ecd095a7b4d3e2ba"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.243902 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" event={"ID":"8eb6b056-16ea-46db-b8ea-fd17a717a8e4","Type":"ContainerStarted","Data":"f88fc03862db31b62c9ab835ee03bb22c6aedc1454fb7f7afc015dbb8acf7663"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.244811 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.247910 4983 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" event={"ID":"7872b362-5118-4058-abba-048e0a81ecff","Type":"ContainerStarted","Data":"94935f58532543c2baf38622cd3e455f27ee3fa634a67932b462375e9b856c00"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.248454 4983 scope.go:117] "RemoveContainer" containerID="b8812e102e4a81f30e5ce1c0f485f8bff4418d56c5b0c04d3dec108534a29084" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.251633 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" event={"ID":"05523d68-53d9-4cc5-a02b-5221a2396606","Type":"ContainerStarted","Data":"40839b5fc575fdad0097b730a2235961ba9936fd606461838603f07be3d289d7"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.252211 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.253907 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.256101 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd" event={"ID":"7e065fa9-405e-452b-bfe7-c4920a8577db","Type":"ContainerStarted","Data":"2239846f8ee0496822dcc8e3af5f7aa7292d1c49670430ca478852c50a72b302"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.261581 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-sn6x8" podStartSLOduration=3.247600189 podStartE2EDuration="15.261561379s" podCreationTimestamp="2026-03-16 00:20:40 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.872486144 +0000 UTC m=+850.472584574" lastFinishedPulling="2026-03-16 00:20:53.886447334 +0000 UTC m=+862.486545764" observedRunningTime="2026-03-16 
00:20:55.249571974 +0000 UTC m=+863.849670404" watchObservedRunningTime="2026-03-16 00:20:55.261561379 +0000 UTC m=+863.861659809" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.264929 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" event={"ID":"30d188b9-ab98-47a3-8143-3f58ae611dd6","Type":"ContainerStarted","Data":"ab05ea1fcb6362c94ba5714366dcb07eb754c5a462062be9fab0d670bbb77ad4"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.276251 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-246kv" event={"ID":"a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb","Type":"ContainerDied","Data":"189ad11b896e38a9ec020acccd5db23277edbb2d4b0baea800bc4896e8ea296a"} Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.276351 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-246kv" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.292476 4983 scope.go:117] "RemoveContainer" containerID="fdae44ae256e839c75602b525bcc23b96273c95335b8e9ad6fa6615a4eb894ee" Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.303699 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.326504 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l677b"] Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.327100 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt" podStartSLOduration=2.372363789 podStartE2EDuration="14.327088996s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.963699415 +0000 UTC m=+850.563797845" lastFinishedPulling="2026-03-16 00:20:53.918424622 +0000 UTC m=+862.518523052" 
observedRunningTime="2026-03-16 00:20:55.316219451 +0000 UTC m=+863.916317881" watchObservedRunningTime="2026-03-16 00:20:55.327088996 +0000 UTC m=+863.927187426"
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.343595 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd" podStartSLOduration=2.318380974 podStartE2EDuration="14.343573348s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.873875051 +0000 UTC m=+850.473973481" lastFinishedPulling="2026-03-16 00:20:53.899067425 +0000 UTC m=+862.499165855" observedRunningTime="2026-03-16 00:20:55.338391483 +0000 UTC m=+863.938489923" watchObservedRunningTime="2026-03-16 00:20:55.343573348 +0000 UTC m=+863.943671778"
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.359491 4983 scope.go:117] "RemoveContainer" containerID="3b8ac524d21ce5b4ca39d7d424ad8e7fa2a9ecfa77a2cbd3d9c646eb838316b7"
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.373896 4983 scope.go:117] "RemoveContainer" containerID="7b01106210ef02c5fc4b2b479cfcc4510c0e3a3e5038d3f280ee033403a51f3c"
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.385492 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c86566d45-kx26j" podStartSLOduration=2.190860512 podStartE2EDuration="14.385474307s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.729681631 +0000 UTC m=+850.329780061" lastFinishedPulling="2026-03-16 00:20:53.924295426 +0000 UTC m=+862.524393856" observedRunningTime="2026-03-16 00:20:55.381859352 +0000 UTC m=+863.981957792" watchObservedRunningTime="2026-03-16 00:20:55.385474307 +0000 UTC m=+863.985572737"
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.434189 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-c99mb" podStartSLOduration=2.334999669 podStartE2EDuration="14.434168513s" podCreationTimestamp="2026-03-16 00:20:41 +0000 UTC" firstStartedPulling="2026-03-16 00:20:41.827402422 +0000 UTC m=+850.427500852" lastFinishedPulling="2026-03-16 00:20:53.926571266 +0000 UTC m=+862.526669696" observedRunningTime="2026-03-16 00:20:55.426371859 +0000 UTC m=+864.026470309" watchObservedRunningTime="2026-03-16 00:20:55.434168513 +0000 UTC m=+864.034266943"
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.458185 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"]
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.466110 4983 scope.go:117] "RemoveContainer" containerID="cf4b26abd512b4e809f852ba2adfc32b6dc9094135d78aae3f567c7db9c58b4d"
Mar 16 00:20:55 crc kubenswrapper[4983]: I0316 00:20:55.468784 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-246kv"]
Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.101419 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" path="/var/lib/kubelet/pods/a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb/volumes"
Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.102216 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" path="/var/lib/kubelet/pods/e93de2c7-8794-463c-9a2d-ac74246f35b7/volumes"
Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.286737 4983 generic.go:334] "Generic (PLEG): container finished" podID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerID="018414fffea206a4b55f418bf8e92cf5e98ded9ca1caa4406ee97e9a967f5a8e" exitCode=0
Mar 16 00:20:56 crc kubenswrapper[4983]: I0316 00:20:56.286791 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"018414fffea206a4b55f418bf8e92cf5e98ded9ca1caa4406ee97e9a967f5a8e"}
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.118857 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.243592 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") pod \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") "
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.243645 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") pod \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") "
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.243711 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") pod \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\" (UID: \"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a\") "
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.248152 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle" (OuterVolumeSpecName: "bundle") pod "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" (UID: "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.254259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx" (OuterVolumeSpecName: "kube-api-access-696hx") pod "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" (UID: "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a"). InnerVolumeSpecName "kube-api-access-696hx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.258859 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util" (OuterVolumeSpecName: "util") pod "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" (UID: "cd45ab45-645e-45d3-a9eb-a3d1392b5f7a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.314155 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm" event={"ID":"cd45ab45-645e-45d3-a9eb-a3d1392b5f7a","Type":"ContainerDied","Data":"d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d"}
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.314457 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7dd8b01edd106061e6d5e680eb573feed395612e08671515bd58dfbd61f800d"
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.314411 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm"
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.344719 4983 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-util\") on node \"crc\" DevicePath \"\""
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.344766 4983 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-bundle\") on node \"crc\" DevicePath \"\""
Mar 16 00:20:58 crc kubenswrapper[4983]: I0316 00:20:58.344780 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-696hx\" (UniqueName: \"kubernetes.io/projected/cd45ab45-645e-45d3-a9eb-a3d1392b5f7a-kube-api-access-696hx\") on node \"crc\" DevicePath \"\""
Mar 16 00:20:59 crc kubenswrapper[4983]: I0316 00:20:59.322599 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" event={"ID":"7872b362-5118-4058-abba-048e0a81ecff","Type":"ContainerStarted","Data":"5c73e1ae7281bfb93b4a40e8701c75b7046f62a853b8f96c7a8ba4ab9015050c"}
Mar 16 00:20:59 crc kubenswrapper[4983]: I0316 00:20:59.344396 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-b96d44b59-tbkm6" podStartSLOduration=8.606621344 podStartE2EDuration="12.344376838s" podCreationTimestamp="2026-03-16 00:20:47 +0000 UTC" firstStartedPulling="2026-03-16 00:20:54.445050796 +0000 UTC m=+863.045149226" lastFinishedPulling="2026-03-16 00:20:58.18280629 +0000 UTC m=+866.782904720" observedRunningTime="2026-03-16 00:20:59.341051711 +0000 UTC m=+867.941150141" watchObservedRunningTime="2026-03-16 00:20:59.344376838 +0000 UTC m=+867.944475268"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.786741 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787163 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="pull"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787174 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="pull"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787187 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-utilities"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787195 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-utilities"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787205 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-content"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787211 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="extract-content"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787217 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="extract"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787223 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="extract"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787229 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787235 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787245 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-utilities"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787250 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-utilities"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787259 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="util"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787265 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="util"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787274 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-content"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787279 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="extract-content"
Mar 16 00:21:00 crc kubenswrapper[4983]: E0316 00:21:00.787285 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787291 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787390 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8fefe96-0a7d-4f0c-ad4c-9ddb1573f5eb" containerName="registry-server"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787399 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd45ab45-645e-45d3-a9eb-a3d1392b5f7a" containerName="extract"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.787410 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93de2c7-8794-463c-9a2d-ac74246f35b7" containerName="registry-server"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.788105 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790240 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790240 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790877 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790898 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.790906 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791225 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791851 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791911 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-4rm6d"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.791855 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.815994 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.875855 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.875913 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.875990 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876032 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876049 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876134 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876242 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876298 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876328 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876358 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876404 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876435 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876463 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.876529 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977485 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977550 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977588 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977615 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977650 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977678 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977701 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977741 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977782 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977814 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977838 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977857 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977896 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977926 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.977949 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.978508 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.978564 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.979402 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.983065 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.983232 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.984070 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.984492 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:00 crc kubenswrapper[4983]: I0316 00:21:00.996925 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.000590 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.002202 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.003009 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.003845 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.006268 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.009549 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.009907 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e\") " pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.107391 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Mar 16 00:21:01 crc kubenswrapper[4983]: I0316 00:21:01.723181 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-dmdpt"
Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.193823 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.208476 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.356829 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerStarted","Data":"71ea766235047027981feea4dc6ee07b3bf940319a59262e0b075fa246950336"}
Mar 16 00:21:04 crc kubenswrapper[4983]: I0316 00:21:04.358013 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" event={"ID":"5f1c8286-7638-43ad-bfec-fe7210fa4d73","Type":"ContainerStarted","Data":"d2920cd923d169ed87512c952802f3cd88209f9f6e10e2f0395529589d2811d5"}
Mar 16 00:21:06 crc kubenswrapper[4983]: I0316 00:21:06.019055 4983 scope.go:117] "RemoveContainer" containerID="5fe0de833b2b27c1bfe835628ef9c6dca727580c2781fda123b15ad86663176a"
Mar 16 00:21:12 crc kubenswrapper[4983]: I0316 00:21:12.143309 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-hdnm6" podStartSLOduration=18.421357421 podStartE2EDuration="28.143294463s" podCreationTimestamp="2026-03-16 00:20:44 +0000 UTC" firstStartedPulling="2026-03-16 00:20:54.301285318 +0000 UTC m=+862.901383748" lastFinishedPulling="2026-03-16 00:21:04.02322236 +0000 UTC m=+872.623320790" observedRunningTime="2026-03-16 00:21:04.372574716 +0000 UTC m=+872.972673146" watchObservedRunningTime="2026-03-16 00:21:12.143294463 +0000 UTC m=+880.743392893"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.154245 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"]
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.155380 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.156747 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-k4mnx"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.157271 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.159803 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.175550 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"]
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.302581 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cdf343fd-5723-4e49-ad01-837bb0bbbed2-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.302658 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv5vt\" (UniqueName: \"kubernetes.io/projected/cdf343fd-5723-4e49-ad01-837bb0bbbed2-kube-api-access-fv5vt\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.404356 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cdf343fd-5723-4e49-ad01-837bb0bbbed2-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.404438 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv5vt\" (UniqueName: \"kubernetes.io/projected/cdf343fd-5723-4e49-ad01-837bb0bbbed2-kube-api-access-fv5vt\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.405199 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cdf343fd-5723-4e49-ad01-837bb0bbbed2-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID: \"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"
Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.420931 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv5vt\" (UniqueName: \"kubernetes.io/projected/cdf343fd-5723-4e49-ad01-837bb0bbbed2-kube-api-access-fv5vt\") pod \"cert-manager-operator-controller-manager-5586865c96-5789l\" (UID:
\"cdf343fd-5723-4e49-ad01-837bb0bbbed2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:16 crc kubenswrapper[4983]: I0316 00:21:16.480183 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.369597 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.372543 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:
READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/sc
ripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.373908 4983 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:26 crc kubenswrapper[4983]: E0316 00:21:26.524606 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:26 crc kubenswrapper[4983]: I0316 00:21:26.671358 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:21:26 crc kubenswrapper[4983]: I0316 00:21:26.684586 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l"] Mar 16 00:21:26 crc kubenswrapper[4983]: I0316 00:21:26.724543 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 16 00:21:27 crc kubenswrapper[4983]: I0316 00:21:27.529216 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" event={"ID":"cdf343fd-5723-4e49-ad01-837bb0bbbed2","Type":"ContainerStarted","Data":"88ba03d885765707fa54832ae7f4ef16e3e4330bf9b580ccf51c7eabc286c451"} Mar 16 00:21:27 crc kubenswrapper[4983]: E0316 00:21:27.530949 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" 
podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:28 crc kubenswrapper[4983]: E0316 00:21:28.536527 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" Mar 16 00:21:30 crc kubenswrapper[4983]: I0316 00:21:30.547072 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" event={"ID":"cdf343fd-5723-4e49-ad01-837bb0bbbed2","Type":"ContainerStarted","Data":"302eaa980197387ded55e9ed71a52cd946778c6cecd618cf83a5d42430c84019"} Mar 16 00:21:30 crc kubenswrapper[4983]: I0316 00:21:30.565281 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-5789l" podStartSLOduration=11.84433926 podStartE2EDuration="14.56526047s" podCreationTimestamp="2026-03-16 00:21:16 +0000 UTC" firstStartedPulling="2026-03-16 00:21:26.694398407 +0000 UTC m=+895.294496837" lastFinishedPulling="2026-03-16 00:21:29.415319617 +0000 UTC m=+898.015418047" observedRunningTime="2026-03-16 00:21:30.561373668 +0000 UTC m=+899.161472108" watchObservedRunningTime="2026-03-16 00:21:30.56526047 +0000 UTC m=+899.165358900" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.384155 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-jbjkj"] Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.385438 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.387940 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.388352 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.389527 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-l52zx" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.395676 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-jbjkj"] Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.548426 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bmw2\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-kube-api-access-9bmw2\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.548504 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.649877 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bmw2\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-kube-api-access-9bmw2\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: 
\"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.649957 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.667229 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.669355 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bmw2\" (UniqueName: \"kubernetes.io/projected/1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef-kube-api-access-9bmw2\") pod \"cert-manager-webhook-6888856db4-jbjkj\" (UID: \"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef\") " pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.699480 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:33 crc kubenswrapper[4983]: I0316 00:21:33.912656 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-jbjkj"] Mar 16 00:21:33 crc kubenswrapper[4983]: W0316 00:21:33.919859 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1bc27_146d_4df6_9e00_7e0cfb7f28ef.slice/crio-bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601 WatchSource:0}: Error finding container bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601: Status 404 returned error can't find the container with id bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601 Mar 16 00:21:34 crc kubenswrapper[4983]: I0316 00:21:34.569574 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" event={"ID":"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef","Type":"ContainerStarted","Data":"bf83c2b5f6c7eac36150a9a2baaabe47679fe50badd2e3d73792a039cfff3601"} Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.509460 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-g9j58"] Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.510291 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.512117 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-87bvp" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.517311 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-g9j58"] Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.671965 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjcrw\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-kube-api-access-hjcrw\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.672036 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.773301 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.773412 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjcrw\" (UniqueName: 
\"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-kube-api-access-hjcrw\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.795386 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjcrw\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-kube-api-access-hjcrw\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.795394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c66c255-b5f4-4c72-8902-7225df93821d-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-g9j58\" (UID: \"4c66c255-b5f4-4c72-8902-7225df93821d\") " pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:35 crc kubenswrapper[4983]: I0316 00:21:35.871868 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" Mar 16 00:21:36 crc kubenswrapper[4983]: I0316 00:21:36.068735 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-g9j58"] Mar 16 00:21:36 crc kubenswrapper[4983]: W0316 00:21:36.078790 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c66c255_b5f4_4c72_8902_7225df93821d.slice/crio-5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a WatchSource:0}: Error finding container 5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a: Status 404 returned error can't find the container with id 5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a Mar 16 00:21:36 crc kubenswrapper[4983]: I0316 00:21:36.593298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" event={"ID":"4c66c255-b5f4-4c72-8902-7225df93821d","Type":"ContainerStarted","Data":"5fb61081742f70061662414eaa7b162bbd1b672e48dcca7273647c0e58314b3a"} Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.899436 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.901560 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.903400 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.903907 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.904132 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.904224 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.921275 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.922708 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.922911 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.922953 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923002 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923038 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923068 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923101 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-1-build\" (UID: 
\"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923127 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923154 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923208 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923432 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:37 crc kubenswrapper[4983]: I0316 00:21:37.923458 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.023981 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024022 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024055 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024076 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc 
kubenswrapper[4983]: I0316 00:21:38.024093 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024121 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024139 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024163 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024292 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024317 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024356 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024388 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024409 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024439 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") 
pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024527 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024689 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024828 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.024963 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.025434 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.025690 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.025718 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.031722 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.040203 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.067414 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"service-telemetry-operator-1-build\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.226366 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.533577 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:38 crc kubenswrapper[4983]: W0316 00:21:38.545107 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8dccbc9_1a91_4587_84d0_7e4171bb6632.slice/crio-a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1 WatchSource:0}: Error finding container a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1: Status 404 returned error can't find the container with id a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1 Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.606242 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" event={"ID":"1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef","Type":"ContainerStarted","Data":"8b44564a30d00f9515a3f3d10097b85c3ac93f1f2f3362e3a7587e9f1cddc12a"} Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.606538 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.608138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" 
event={"ID":"4c66c255-b5f4-4c72-8902-7225df93821d","Type":"ContainerStarted","Data":"68447b504f8ad1a0f6b4f3a2aae0e78ea502ce8b7bfd4b19a53c2bc53c1805c2"} Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.609140 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerStarted","Data":"a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1"} Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.629601 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" podStartSLOduration=1.152113348 podStartE2EDuration="5.629577602s" podCreationTimestamp="2026-03-16 00:21:33 +0000 UTC" firstStartedPulling="2026-03-16 00:21:33.92251409 +0000 UTC m=+902.522612510" lastFinishedPulling="2026-03-16 00:21:38.399978334 +0000 UTC m=+907.000076764" observedRunningTime="2026-03-16 00:21:38.625341509 +0000 UTC m=+907.225439969" watchObservedRunningTime="2026-03-16 00:21:38.629577602 +0000 UTC m=+907.229676032" Mar 16 00:21:38 crc kubenswrapper[4983]: I0316 00:21:38.647569 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-g9j58" podStartSLOduration=1.328880586 podStartE2EDuration="3.647549942s" podCreationTimestamp="2026-03-16 00:21:35 +0000 UTC" firstStartedPulling="2026-03-16 00:21:36.081671737 +0000 UTC m=+904.681770167" lastFinishedPulling="2026-03-16 00:21:38.400341093 +0000 UTC m=+907.000439523" observedRunningTime="2026-03-16 00:21:38.643576306 +0000 UTC m=+907.243674736" watchObservedRunningTime="2026-03-16 00:21:38.647549942 +0000 UTC m=+907.247648372" Mar 16 00:21:43 crc kubenswrapper[4983]: I0316 00:21:43.702405 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-jbjkj" Mar 16 00:21:44 crc kubenswrapper[4983]: I0316 
00:21:44.646212 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerStarted","Data":"0767b6d75d0337c8d943b36c8f8e475d03c2c0b56f536e438a9cbe93495c561e"} Mar 16 00:21:45 crc kubenswrapper[4983]: I0316 00:21:45.654178 4983 generic.go:334] "Generic (PLEG): container finished" podID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerID="85da0c85b3975564bbfc7df9911d364082da5edf592effa9165ac2fcdf52c986" exitCode=0 Mar 16 00:21:45 crc kubenswrapper[4983]: I0316 00:21:45.654292 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerDied","Data":"85da0c85b3975564bbfc7df9911d364082da5edf592effa9165ac2fcdf52c986"} Mar 16 00:21:46 crc kubenswrapper[4983]: I0316 00:21:46.663459 4983 generic.go:334] "Generic (PLEG): container finished" podID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" containerID="0767b6d75d0337c8d943b36c8f8e475d03c2c0b56f536e438a9cbe93495c561e" exitCode=0 Mar 16 00:21:46 crc kubenswrapper[4983]: I0316 00:21:46.663570 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerDied","Data":"0767b6d75d0337c8d943b36c8f8e475d03c2c0b56f536e438a9cbe93495c561e"} Mar 16 00:21:46 crc kubenswrapper[4983]: I0316 00:21:46.668689 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerStarted","Data":"f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd"} Mar 16 00:21:47 crc kubenswrapper[4983]: I0316 00:21:47.976685 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-1-build" podStartSLOduration=5.157988149 
podStartE2EDuration="10.976664807s" podCreationTimestamp="2026-03-16 00:21:37 +0000 UTC" firstStartedPulling="2026-03-16 00:21:38.547658216 +0000 UTC m=+907.147756636" lastFinishedPulling="2026-03-16 00:21:44.366334854 +0000 UTC m=+912.966433294" observedRunningTime="2026-03-16 00:21:46.719054661 +0000 UTC m=+915.319153091" watchObservedRunningTime="2026-03-16 00:21:47.976664807 +0000 UTC m=+916.576763237" Mar 16 00:21:47 crc kubenswrapper[4983]: I0316 00:21:47.980664 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:48 crc kubenswrapper[4983]: I0316 00:21:48.680837 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" containerID="cri-o://f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd" gracePeriod=30 Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.660219 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.661848 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.664514 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.664604 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.664605 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.684325 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737319 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737362 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737490 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737549 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737565 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737592 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737626 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 
00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737645 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737664 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737689 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737704 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.737846 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839221 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839284 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839310 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839369 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839399 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839420 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839438 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839456 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839474 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839489 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839528 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839553 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839675 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod 
\"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.839925 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840169 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840367 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840376 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.840700 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.841079 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.842082 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.847577 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.852709 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.865642 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"service-telemetry-operator-2-build\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:49 crc kubenswrapper[4983]: I0316 00:21:49.981569 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:21:50 crc kubenswrapper[4983]: I0316 00:21:50.845158 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 16 00:21:51 crc kubenswrapper[4983]: I0316 00:21:51.737286 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerStarted","Data":"b616705931879247088ce47bd9e865c198d5a42dd59830a5a6c9d692a05b1e4c"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.247396 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-cwdn9"] Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.248102 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.250552 4983 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xwfm5" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.256720 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-cwdn9"] Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.268028 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-bound-sa-token\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.268100 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wdnp\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-kube-api-access-5wdnp\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.369543 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-bound-sa-token\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.369726 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wdnp\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-kube-api-access-5wdnp\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: 
\"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.396479 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-bound-sa-token\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.398029 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wdnp\" (UniqueName: \"kubernetes.io/projected/95208db3-d53d-43c0-9b2c-cc4c5b3236d8-kube-api-access-5wdnp\") pod \"cert-manager-545d4d4674-cwdn9\" (UID: \"95208db3-d53d-43c0-9b2c-cc4c5b3236d8\") " pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: E0316 00:21:52.503978 4983 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cd8a8b_9f7b_45aa_ad0a_0c84fd70722e.slice/crio-dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56cd8a8b_9f7b_45aa_ad0a_0c84fd70722e.slice/crio-conmon-dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5.scope\": RecentStats: unable to find data in memory cache]" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.578252 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-cwdn9" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.743950 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_f8dccbc9-1a91-4587-84d0-7e4171bb6632/docker-build/0.log" Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.744570 4983 generic.go:334] "Generic (PLEG): container finished" podID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerID="f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd" exitCode=1 Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.744653 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerDied","Data":"f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.746160 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerStarted","Data":"eb890a8487aac34c05d790c7e0c2ffc24a8e23e29e057ce2dea6cae90f0436c3"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.748876 4983 generic.go:334] "Generic (PLEG): container finished" podID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" containerID="dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5" exitCode=0 Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.748917 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerDied","Data":"dd5e6f0a89d0c30f1ed473612d847f5dd2a8615ae5534f0387176c8fa5f060d5"} Mar 16 00:21:52 crc kubenswrapper[4983]: I0316 00:21:52.998806 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-cwdn9"] Mar 16 00:21:53 crc 
kubenswrapper[4983]: W0316 00:21:53.005946 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95208db3_d53d_43c0_9b2c_cc4c5b3236d8.slice/crio-3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5 WatchSource:0}: Error finding container 3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5: Status 404 returned error can't find the container with id 3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5 Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.521904 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_f8dccbc9-1a91-4587-84d0-7e4171bb6632/docker-build/0.log" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.523398 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584273 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584314 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584345 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") pod 
\"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584405 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584457 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584481 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584500 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584516 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584537 4983 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584577 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584621 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584647 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") pod \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\" (UID: \"f8dccbc9-1a91-4587-84d0-7e4171bb6632\") " Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584909 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.584963 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585279 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585351 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585605 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.585892 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.588658 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.589285 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.590236 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.590274 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.590259 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.593058 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w" (OuterVolumeSpecName: "kube-api-access-mkn9w") pod "f8dccbc9-1a91-4587-84d0-7e4171bb6632" (UID: "f8dccbc9-1a91-4587-84d0-7e4171bb6632"). InnerVolumeSpecName "kube-api-access-mkn9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686028 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686060 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686071 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686081 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/f8dccbc9-1a91-4587-84d0-7e4171bb6632-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686089 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686097 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686105 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686114 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686123 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686131 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/f8dccbc9-1a91-4587-84d0-7e4171bb6632-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686138 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkn9w\" (UniqueName: \"kubernetes.io/projected/f8dccbc9-1a91-4587-84d0-7e4171bb6632-kube-api-access-mkn9w\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.686148 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8dccbc9-1a91-4587-84d0-7e4171bb6632-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.756437 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-cwdn9" event={"ID":"95208db3-d53d-43c0-9b2c-cc4c5b3236d8","Type":"ContainerStarted","Data":"d14743bab2e4da6b2538f7db284bd57dde768c5153fe3c6438dcfc742cb1f90c"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.756482 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-cwdn9" 
event={"ID":"95208db3-d53d-43c0-9b2c-cc4c5b3236d8","Type":"ContainerStarted","Data":"3b5f2da13179be49dc4681871e7ceb1b9386a19105ead33b2c089d7e2c6b90a5"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.759982 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e","Type":"ContainerStarted","Data":"577fcf1cd2a5ecb4f5447e156045b4b6272fafffead51c5a949cf15995e185af"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.760214 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.761588 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_f8dccbc9-1a91-4587-84d0-7e4171bb6632/docker-build/0.log" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.762068 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"f8dccbc9-1a91-4587-84d0-7e4171bb6632","Type":"ContainerDied","Data":"a31730e37358485b8cf94523b53921f2ef3ecc2e2422c2578546b9f85c6460f1"} Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.762096 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.762126 4983 scope.go:117] "RemoveContainer" containerID="f66a180fc92b0bfb814379aff091a386a5fc53cd37520e91fa22093fc36712cd" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.775995 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-cwdn9" podStartSLOduration=1.775977581 podStartE2EDuration="1.775977581s" podCreationTimestamp="2026-03-16 00:21:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:21:53.773570757 +0000 UTC m=+922.373669197" watchObservedRunningTime="2026-03-16 00:21:53.775977581 +0000 UTC m=+922.376076011" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.785570 4983 scope.go:117] "RemoveContainer" containerID="85da0c85b3975564bbfc7df9911d364082da5edf592effa9165ac2fcdf52c986" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.817678 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=13.677571118 podStartE2EDuration="53.817663203s" podCreationTimestamp="2026-03-16 00:21:00 +0000 UTC" firstStartedPulling="2026-03-16 00:21:04.208243549 +0000 UTC m=+872.808341989" lastFinishedPulling="2026-03-16 00:21:44.348335644 +0000 UTC m=+912.948434074" observedRunningTime="2026-03-16 00:21:53.817363645 +0000 UTC m=+922.417462105" watchObservedRunningTime="2026-03-16 00:21:53.817663203 +0000 UTC m=+922.417761633" Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.834730 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 16 00:21:53 crc kubenswrapper[4983]: I0316 00:21:53.842435 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] 
Mar 16 00:21:54 crc kubenswrapper[4983]: I0316 00:21:54.100736 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" path="/var/lib/kubelet/pods/f8dccbc9-1a91-4587-84d0-7e4171bb6632/volumes" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140085 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:22:00 crc kubenswrapper[4983]: E0316 00:22:00.140748 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140775 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" Mar 16 00:22:00 crc kubenswrapper[4983]: E0316 00:22:00.140791 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="manage-dockerfile" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140797 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="manage-dockerfile" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.140903 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dccbc9-1a91-4587-84d0-7e4171bb6632" containerName="docker-build" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.141301 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.145008 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.147851 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.148974 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.152832 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.257747 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"auto-csr-approver-29560342-544h5\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.358949 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"auto-csr-approver-29560342-544h5\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.382505 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"auto-csr-approver-29560342-544h5\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " 
pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.457697 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.690273 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.800952 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-544h5" event={"ID":"d0d707a0-1b40-4364-9a61-cde76e2c80a1","Type":"ContainerStarted","Data":"03d9f251332a8028f445c59c6b0b7667a3bf4475e58aae573bb71b533169814c"} Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.802124 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerID="eb890a8487aac34c05d790c7e0c2ffc24a8e23e29e057ce2dea6cae90f0436c3" exitCode=0 Mar 16 00:22:00 crc kubenswrapper[4983]: I0316 00:22:00.802155 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"eb890a8487aac34c05d790c7e0c2ffc24a8e23e29e057ce2dea6cae90f0436c3"} Mar 16 00:22:01 crc kubenswrapper[4983]: I0316 00:22:01.811139 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerID="0bcb454de18d51c1deb1d605b1e69cd13d48c30efe37006944940aa52ae50544" exitCode=0 Mar 16 00:22:01 crc kubenswrapper[4983]: I0316 00:22:01.811254 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"0bcb454de18d51c1deb1d605b1e69cd13d48c30efe37006944940aa52ae50544"} Mar 16 00:22:01 crc kubenswrapper[4983]: I0316 00:22:01.840509 4983 log.go:25] 
"Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_4c20598e-7255-42e7-9ac2-e6e58a8e9c88/manage-dockerfile/0.log" Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.822004 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerStarted","Data":"ad0129f0fbcf710c158fc5bf41eaefcab120c1afd0eb3d2ef815d5472ca56907"} Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.824437 4983 generic.go:334] "Generic (PLEG): container finished" podID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerID="b6146a6dfae8df822feda4cd12d6f532571e6f993bc0ef397b108d23a0fa9361" exitCode=0 Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.824467 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-544h5" event={"ID":"d0d707a0-1b40-4364-9a61-cde76e2c80a1","Type":"ContainerDied","Data":"b6146a6dfae8df822feda4cd12d6f532571e6f993bc0ef397b108d23a0fa9361"} Mar 16 00:22:02 crc kubenswrapper[4983]: I0316 00:22:02.930410 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=13.930389417 podStartE2EDuration="13.930389417s" podCreationTimestamp="2026-03-16 00:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:22:02.891652774 +0000 UTC m=+931.491751204" watchObservedRunningTime="2026-03-16 00:22:02.930389417 +0000 UTC m=+931.530487857" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.081291 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.213306 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") pod \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\" (UID: \"d0d707a0-1b40-4364-9a61-cde76e2c80a1\") " Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.221085 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4" (OuterVolumeSpecName: "kube-api-access-gqkf4") pod "d0d707a0-1b40-4364-9a61-cde76e2c80a1" (UID: "d0d707a0-1b40-4364-9a61-cde76e2c80a1"). InnerVolumeSpecName "kube-api-access-gqkf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.315007 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqkf4\" (UniqueName: \"kubernetes.io/projected/d0d707a0-1b40-4364-9a61-cde76e2c80a1-kube-api-access-gqkf4\") on node \"crc\" DevicePath \"\"" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.835688 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560342-544h5" event={"ID":"d0d707a0-1b40-4364-9a61-cde76e2c80a1","Type":"ContainerDied","Data":"03d9f251332a8028f445c59c6b0b7667a3bf4475e58aae573bb71b533169814c"} Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.835725 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03d9f251332a8028f445c59c6b0b7667a3bf4475e58aae573bb71b533169814c" Mar 16 00:22:04 crc kubenswrapper[4983]: I0316 00:22:04.835814 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560342-544h5" Mar 16 00:22:05 crc kubenswrapper[4983]: I0316 00:22:05.143877 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:22:05 crc kubenswrapper[4983]: I0316 00:22:05.147832 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560336-6d4qf"] Mar 16 00:22:06 crc kubenswrapper[4983]: I0316 00:22:06.100422 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b56bb064-30c4-4aaf-a4d2-c81006425b62" path="/var/lib/kubelet/pods/b56bb064-30c4-4aaf-a4d2-c81006425b62/volumes" Mar 16 00:22:06 crc kubenswrapper[4983]: I0316 00:22:06.228656 4983 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e" containerName="elasticsearch" probeResult="failure" output=< Mar 16 00:22:06 crc kubenswrapper[4983]: {"timestamp": "2026-03-16T00:22:06+00:00", "message": "readiness probe failed", "curl_rc": "7"} Mar 16 00:22:06 crc kubenswrapper[4983]: > Mar 16 00:22:06 crc kubenswrapper[4983]: I0316 00:22:06.417315 4983 scope.go:117] "RemoveContainer" containerID="a092715a78836d6cc7d08c15d4c8579198cd91313410de0ab11035815df03f19" Mar 16 00:22:11 crc kubenswrapper[4983]: I0316 00:22:11.359537 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Mar 16 00:22:23 crc kubenswrapper[4983]: I0316 00:22:23.447858 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:23 crc kubenswrapper[4983]: I0316 00:22:23.448296 4983 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:22:53 crc kubenswrapper[4983]: I0316 00:22:53.448849 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:22:53 crc kubenswrapper[4983]: I0316 00:22:53.449404 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.936228 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:11 crc kubenswrapper[4983]: E0316 00:23:11.937153 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerName="oc" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.937174 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerName="oc" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.937689 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" containerName="oc" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.940507 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:11 crc kubenswrapper[4983]: I0316 00:23:11.954981 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.104773 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.104901 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.104979 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.206339 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.206403 4983 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.206459 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.207218 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.207398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.238620 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"community-operators-755c7\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.272741 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:12 crc kubenswrapper[4983]: I0316 00:23:12.587872 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:13 crc kubenswrapper[4983]: I0316 00:23:13.287803 4983 generic.go:334] "Generic (PLEG): container finished" podID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" exitCode=0 Mar 16 00:23:13 crc kubenswrapper[4983]: I0316 00:23:13.287883 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2"} Mar 16 00:23:13 crc kubenswrapper[4983]: I0316 00:23:13.288220 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerStarted","Data":"170d15e1acedf5bba4266a2e8a9f558a7f2f0cc3269fc453b068a5bc544c83b9"} Mar 16 00:23:14 crc kubenswrapper[4983]: I0316 00:23:14.296188 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerStarted","Data":"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba"} Mar 16 00:23:15 crc kubenswrapper[4983]: I0316 00:23:15.303574 4983 generic.go:334] "Generic (PLEG): container finished" podID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" exitCode=0 Mar 16 00:23:15 crc kubenswrapper[4983]: I0316 00:23:15.303616 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" 
event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba"} Mar 16 00:23:17 crc kubenswrapper[4983]: I0316 00:23:17.316319 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerStarted","Data":"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3"} Mar 16 00:23:17 crc kubenswrapper[4983]: I0316 00:23:17.336394 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-755c7" podStartSLOduration=2.868353231 podStartE2EDuration="6.336380068s" podCreationTimestamp="2026-03-16 00:23:11 +0000 UTC" firstStartedPulling="2026-03-16 00:23:13.289335471 +0000 UTC m=+1001.889433911" lastFinishedPulling="2026-03-16 00:23:16.757362308 +0000 UTC m=+1005.357460748" observedRunningTime="2026-03-16 00:23:17.332267609 +0000 UTC m=+1005.932366069" watchObservedRunningTime="2026-03-16 00:23:17.336380068 +0000 UTC m=+1005.936478498" Mar 16 00:23:20 crc kubenswrapper[4983]: I0316 00:23:20.335275 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerID="ad0129f0fbcf710c158fc5bf41eaefcab120c1afd0eb3d2ef815d5472ca56907" exitCode=0 Mar 16 00:23:20 crc kubenswrapper[4983]: I0316 00:23:20.335341 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"ad0129f0fbcf710c158fc5bf41eaefcab120c1afd0eb3d2ef815d5472ca56907"} Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.627860 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730291 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730398 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730419 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730449 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730485 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730538 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730579 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730608 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730650 4983 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730683 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730729 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730800 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.730836 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") pod \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\" (UID: \"4c20598e-7255-42e7-9ac2-e6e58a8e9c88\") " Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731099 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildcachedir\") on node \"crc\" DevicePath 
\"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731085 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731475 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.731795 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.732235 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.733263 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.737700 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.737747 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.742902 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk" (OuterVolumeSpecName: "kube-api-access-dcpqk") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "kube-api-access-dcpqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.767286 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832478 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832734 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832748 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832777 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832790 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832802 4983 
reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832813 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832824 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.832835 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcpqk\" (UniqueName: \"kubernetes.io/projected/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-kube-api-access-dcpqk\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.923660 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:21 crc kubenswrapper[4983]: I0316 00:23:21.934390 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.273199 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.273248 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.317901 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.352262 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"4c20598e-7255-42e7-9ac2-e6e58a8e9c88","Type":"ContainerDied","Data":"b616705931879247088ce47bd9e865c198d5a42dd59830a5a6c9d692a05b1e4c"} Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.352312 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b616705931879247088ce47bd9e865c198d5a42dd59830a5a6c9d692a05b1e4c" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.352422 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.393543 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:22 crc kubenswrapper[4983]: I0316 00:23:22.554348 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.448805 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.448870 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.448917 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.449464 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.449506 4983 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5" gracePeriod=600 Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.603766 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "4c20598e-7255-42e7-9ac2-e6e58a8e9c88" (UID: "4c20598e-7255-42e7-9ac2-e6e58a8e9c88"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:23 crc kubenswrapper[4983]: I0316 00:23:23.659909 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/4c20598e-7255-42e7-9ac2-e6e58a8e9c88-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.371553 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5" exitCode=0 Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.371626 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5"} Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.372020 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf"} Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.372069 
4983 scope.go:117] "RemoveContainer" containerID="c46c4de7c35ace23617ad378a775dc0cdbe9c0cb791abead202e26dd6d103d18" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.372086 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-755c7" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" containerID="cri-o://6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" gracePeriod=2 Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.719781 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.873818 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") pod \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.873910 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") pod \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.874160 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") pod \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\" (UID: \"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35\") " Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.875148 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities" (OuterVolumeSpecName: 
"utilities") pod "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" (UID: "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.884033 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc" (OuterVolumeSpecName: "kube-api-access-j9pqc") pod "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" (UID: "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35"). InnerVolumeSpecName "kube-api-access-j9pqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.940570 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" (UID: "15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.975807 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.976103 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:24 crc kubenswrapper[4983]: I0316 00:23:24.976118 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9pqc\" (UniqueName: \"kubernetes.io/projected/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35-kube-api-access-j9pqc\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387240 4983 generic.go:334] "Generic (PLEG): container finished" podID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" exitCode=0 Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387295 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3"} Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387363 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-755c7" event={"ID":"15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35","Type":"ContainerDied","Data":"170d15e1acedf5bba4266a2e8a9f558a7f2f0cc3269fc453b068a5bc544c83b9"} Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.387394 4983 scope.go:117] "RemoveContainer" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 
00:23:25.387613 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-755c7" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.406107 4983 scope.go:117] "RemoveContainer" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.444789 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.448429 4983 scope.go:117] "RemoveContainer" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.455359 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-755c7"] Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.484043 4983 scope.go:117] "RemoveContainer" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.484580 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3\": container with ID starting with 6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3 not found: ID does not exist" containerID="6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.484608 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3"} err="failed to get container status \"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3\": rpc error: code = NotFound desc = could not find container \"6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3\": container with ID starting with 
6e2c7937092b74363a85633c27960e43b94c28505d6c3929567ee3486b2107f3 not found: ID does not exist" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.484631 4983 scope.go:117] "RemoveContainer" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.485144 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba\": container with ID starting with 3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba not found: ID does not exist" containerID="3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.485167 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba"} err="failed to get container status \"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba\": rpc error: code = NotFound desc = could not find container \"3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba\": container with ID starting with 3a67f76a9dc1f0da958c023baa5e64bcc8437cd9406185dff1c73dc8075d11ba not found: ID does not exist" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.485184 4983 scope.go:117] "RemoveContainer" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.485565 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2\": container with ID starting with 21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2 not found: ID does not exist" containerID="21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2" Mar 16 00:23:25 crc 
kubenswrapper[4983]: I0316 00:23:25.485587 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2"} err="failed to get container status \"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2\": rpc error: code = NotFound desc = could not find container \"21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2\": container with ID starting with 21ea3a912eb2b2294f717b8dc4102f6c48850ba990fb51d0eb1afaa0f17ce1b2 not found: ID does not exist" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.959990 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.961505 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="docker-build" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.961652 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="docker-build" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.961896 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-content" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.962017 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-content" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.962159 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="manage-dockerfile" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.962278 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="manage-dockerfile" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.962397 
4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="git-clone" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.962517 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="git-clone" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.962674 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963079 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" Mar 16 00:23:25 crc kubenswrapper[4983]: E0316 00:23:25.963231 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-utilities" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963344 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="extract-utilities" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963633 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c20598e-7255-42e7-9ac2-e6e58a8e9c88" containerName="docker-build" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.963845 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" containerName="registry-server" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.965160 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.967891 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.968069 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.968515 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.969000 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config" Mar 16 00:23:25 crc kubenswrapper[4983]: I0316 00:23:25.971265 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.010870 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.011048 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.011168 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.011380 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.101796 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35" path="/var/lib/kubelet/pods/15f9df5e-a9b6-4ab7-8ef6-4dd94eae7a35/volumes" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112277 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112309 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112328 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112346 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112383 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112432 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112512 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: 
I0316 00:23:26.112570 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112598 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112625 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112644 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112743 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " 
pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.112843 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.113045 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.113244 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.118880 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213233 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod 
\"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213546 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213610 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213654 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213679 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213735 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213788 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.213825 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214124 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214151 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214512 4983 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214532 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214605 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.214747 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.222120 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.240354 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"smart-gateway-operator-1-build\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.322589 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:26 crc kubenswrapper[4983]: I0316 00:23:26.538655 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:27 crc kubenswrapper[4983]: I0316 00:23:27.412171 4983 generic.go:334] "Generic (PLEG): container finished" podID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" exitCode=0 Mar 16 00:23:27 crc kubenswrapper[4983]: I0316 00:23:27.412228 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerDied","Data":"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159"} Mar 16 00:23:27 crc kubenswrapper[4983]: I0316 00:23:27.412287 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerStarted","Data":"6801f811951081f606c13f90ca767a1c29a47094846fcf902240672ea47c4ae4"} Mar 16 00:23:28 crc kubenswrapper[4983]: I0316 00:23:28.434708 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerStarted","Data":"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad"} Mar 16 00:23:28 crc kubenswrapper[4983]: I0316 00:23:28.456200 
4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=3.456184498 podStartE2EDuration="3.456184498s" podCreationTimestamp="2026-03-16 00:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:28.453504247 +0000 UTC m=+1017.053602677" watchObservedRunningTime="2026-03-16 00:23:28.456184498 +0000 UTC m=+1017.056282928" Mar 16 00:23:36 crc kubenswrapper[4983]: I0316 00:23:36.482122 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:36 crc kubenswrapper[4983]: I0316 00:23:36.482932 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" containerID="cri-o://9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" gracePeriod=30 Mar 16 00:23:37 crc kubenswrapper[4983]: I0316 00:23:37.949341 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_d267f9a9-cbff-4b92-8a83-50f8d9bc80e8/docker-build/0.log" Mar 16 00:23:37 crc kubenswrapper[4983]: I0316 00:23:37.950962 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063380 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063457 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063496 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063530 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063565 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063626 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063664 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063704 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063745 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063878 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063917 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: 
\"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.063957 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") pod \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\" (UID: \"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8\") " Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.064411 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.064908 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.064972 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.065314 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.066128 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.066830 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.067204 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.070297 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb" (OuterVolumeSpecName: "kube-api-access-bxmnb") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "kube-api-access-bxmnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.071369 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.075903 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119343 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.119592 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="manage-dockerfile" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119604 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="manage-dockerfile" Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.119614 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119620 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.119737 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerName="docker-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.120695 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.122852 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.123720 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.124493 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165930 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165971 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165984 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.165995 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166006 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-pull\") 
on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166019 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166030 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166043 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxmnb\" (UniqueName: \"kubernetes.io/projected/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-kube-api-access-bxmnb\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166055 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166065 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.166088 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.239821 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266707 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266745 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266792 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266808 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266832 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: 
\"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266861 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266881 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266916 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266932 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc 
kubenswrapper[4983]: I0316 00:23:38.266949 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.266971 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.267000 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.267373 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368459 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc 
kubenswrapper[4983]: I0316 00:23:38.368508 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368534 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368559 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368589 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368616 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " 
pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368643 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368704 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368727 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368768 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368799 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod 
\"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368827 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368816 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.368907 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369128 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369179 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod 
\"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369580 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369832 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.369847 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.370671 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.371313 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.372685 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.398049 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.402140 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"smart-gateway-operator-2-build\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.434301 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501249 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_d267f9a9-cbff-4b92-8a83-50f8d9bc80e8/docker-build/0.log" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501909 4983 generic.go:334] "Generic (PLEG): container finished" podID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" exitCode=1 Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501970 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerDied","Data":"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad"} Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.501999 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.502012 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"d267f9a9-cbff-4b92-8a83-50f8d9bc80e8","Type":"ContainerDied","Data":"6801f811951081f606c13f90ca767a1c29a47094846fcf902240672ea47c4ae4"} Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.502041 4983 scope.go:117] "RemoveContainer" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.569530 4983 scope.go:117] "RemoveContainer" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.589995 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" (UID: "d267f9a9-cbff-4b92-8a83-50f8d9bc80e8"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.593542 4983 scope.go:117] "RemoveContainer" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.594099 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad\": container with ID starting with 9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad not found: ID does not exist" containerID="9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.594129 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad"} err="failed to get container status \"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad\": rpc error: code = NotFound desc = could not find container \"9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad\": container with ID starting with 9b27a6536fbf902f88eaa7c9184be204deeb8f76f98b6d6cb38d1cabeb931bad not found: ID does not exist" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.594150 4983 scope.go:117] "RemoveContainer" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" Mar 16 00:23:38 crc kubenswrapper[4983]: E0316 00:23:38.594543 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159\": container with ID starting with bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159 not found: ID does not exist" containerID="bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.594563 
4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159"} err="failed to get container status \"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159\": rpc error: code = NotFound desc = could not find container \"bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159\": container with ID starting with bc90f002d95ec974c45d5b7ba91d81bd37f001f6ebe6248d4120769aae9c9159 not found: ID does not exist" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.672328 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.729618 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.833564 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:38 crc kubenswrapper[4983]: I0316 00:23:38.837614 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 16 00:23:39 crc kubenswrapper[4983]: I0316 00:23:39.511385 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerStarted","Data":"e4ce240877f3e19ee8e22cd3f7f31da5a9601a66ff499b649538e63303f2f660"} Mar 16 00:23:39 crc kubenswrapper[4983]: I0316 00:23:39.513273 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerStarted","Data":"dbfaff915d0fcafb60574b0839b0d5966137d4dc64cbbc24ce53621c7b3c2439"} Mar 16 
00:23:40 crc kubenswrapper[4983]: I0316 00:23:40.098886 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d267f9a9-cbff-4b92-8a83-50f8d9bc80e8" path="/var/lib/kubelet/pods/d267f9a9-cbff-4b92-8a83-50f8d9bc80e8/volumes" Mar 16 00:23:40 crc kubenswrapper[4983]: I0316 00:23:40.519118 4983 generic.go:334] "Generic (PLEG): container finished" podID="8045812f-963d-4b8f-ae8d-584addf74cae" containerID="e4ce240877f3e19ee8e22cd3f7f31da5a9601a66ff499b649538e63303f2f660" exitCode=0 Mar 16 00:23:40 crc kubenswrapper[4983]: I0316 00:23:40.519157 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"e4ce240877f3e19ee8e22cd3f7f31da5a9601a66ff499b649538e63303f2f660"} Mar 16 00:23:41 crc kubenswrapper[4983]: I0316 00:23:41.529159 4983 generic.go:334] "Generic (PLEG): container finished" podID="8045812f-963d-4b8f-ae8d-584addf74cae" containerID="a502b68860c273118e04b350d0e5115354f1da1c30f1f1256f06eaef0f154670" exitCode=0 Mar 16 00:23:41 crc kubenswrapper[4983]: I0316 00:23:41.529346 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"a502b68860c273118e04b350d0e5115354f1da1c30f1f1256f06eaef0f154670"} Mar 16 00:23:41 crc kubenswrapper[4983]: I0316 00:23:41.562321 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_8045812f-963d-4b8f-ae8d-584addf74cae/manage-dockerfile/0.log" Mar 16 00:23:42 crc kubenswrapper[4983]: I0316 00:23:42.541352 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerStarted","Data":"0471081bd69ebbcfb49f768098962d26cbc3cd949b019ccbe48697254a1b6d9c"} Mar 16 00:23:42 crc 
kubenswrapper[4983]: I0316 00:23:42.575542 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=4.575517022 podStartE2EDuration="4.575517022s" podCreationTimestamp="2026-03-16 00:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:23:42.573490548 +0000 UTC m=+1031.173589018" watchObservedRunningTime="2026-03-16 00:23:42.575517022 +0000 UTC m=+1031.175615472" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.138689 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.139863 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.141931 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.142051 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.142562 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.151229 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.156892 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"auto-csr-approver-29560344-6jrbk\" (UID: 
\"159f5145-349d-4018-a8d2-251363a76196\") " pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.258609 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"auto-csr-approver-29560344-6jrbk\" (UID: \"159f5145-349d-4018-a8d2-251363a76196\") " pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.279337 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"auto-csr-approver-29560344-6jrbk\" (UID: \"159f5145-349d-4018-a8d2-251363a76196\") " pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.458065 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.645352 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"] Mar 16 00:24:00 crc kubenswrapper[4983]: W0316 00:24:00.650005 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod159f5145_349d_4018_a8d2_251363a76196.slice/crio-38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687 WatchSource:0}: Error finding container 38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687: Status 404 returned error can't find the container with id 38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687 Mar 16 00:24:00 crc kubenswrapper[4983]: I0316 00:24:00.664458 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" event={"ID":"159f5145-349d-4018-a8d2-251363a76196","Type":"ContainerStarted","Data":"38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687"} Mar 16 00:24:02 crc kubenswrapper[4983]: I0316 00:24:02.686248 4983 generic.go:334] "Generic (PLEG): container finished" podID="159f5145-349d-4018-a8d2-251363a76196" containerID="12df34a0b20427d6c21f033a68c9272147f096ee9a9f4ed3b7d0b3054eb18184" exitCode=0 Mar 16 00:24:02 crc kubenswrapper[4983]: I0316 00:24:02.686312 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" event={"ID":"159f5145-349d-4018-a8d2-251363a76196","Type":"ContainerDied","Data":"12df34a0b20427d6c21f033a68c9272147f096ee9a9f4ed3b7d0b3054eb18184"} Mar 16 00:24:03 crc kubenswrapper[4983]: I0316 00:24:03.960237 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.006377 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") pod \"159f5145-349d-4018-a8d2-251363a76196\" (UID: \"159f5145-349d-4018-a8d2-251363a76196\") " Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.011043 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt" (OuterVolumeSpecName: "kube-api-access-hlngt") pod "159f5145-349d-4018-a8d2-251363a76196" (UID: "159f5145-349d-4018-a8d2-251363a76196"). InnerVolumeSpecName "kube-api-access-hlngt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.108117 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlngt\" (UniqueName: \"kubernetes.io/projected/159f5145-349d-4018-a8d2-251363a76196-kube-api-access-hlngt\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.700927 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" event={"ID":"159f5145-349d-4018-a8d2-251363a76196","Type":"ContainerDied","Data":"38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687"} Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.701322 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38b98613a41d0f5a9c0749d014ebd4c6c356efdefa45f481f223214edaece687" Mar 16 00:24:04 crc kubenswrapper[4983]: I0316 00:24:04.701026 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560344-6jrbk" Mar 16 00:24:05 crc kubenswrapper[4983]: I0316 00:24:05.023799 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:24:05 crc kubenswrapper[4983]: I0316 00:24:05.027910 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560338-2jkpl"] Mar 16 00:24:06 crc kubenswrapper[4983]: I0316 00:24:06.101365 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6e333f-fadd-4c92-8db1-b9a923850fa0" path="/var/lib/kubelet/pods/1c6e333f-fadd-4c92-8db1-b9a923850fa0/volumes" Mar 16 00:24:06 crc kubenswrapper[4983]: I0316 00:24:06.490106 4983 scope.go:117] "RemoveContainer" containerID="e40b8ff2ea2fe096fb51ca5ef76f5eab03f687249bde3326f40974dcfd1c4938" Mar 16 00:24:41 crc kubenswrapper[4983]: I0316 00:24:41.923288 4983 generic.go:334] "Generic (PLEG): container finished" podID="8045812f-963d-4b8f-ae8d-584addf74cae" containerID="0471081bd69ebbcfb49f768098962d26cbc3cd949b019ccbe48697254a1b6d9c" exitCode=0 Mar 16 00:24:41 crc kubenswrapper[4983]: I0316 00:24:41.923518 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"0471081bd69ebbcfb49f768098962d26cbc3cd949b019ccbe48697254a1b6d9c"} Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.225993 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251174 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251281 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251309 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251342 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251376 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251419 4983 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251453 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251477 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251540 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251566 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251592 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.251614 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") pod \"8045812f-963d-4b8f-ae8d-584addf74cae\" (UID: \"8045812f-963d-4b8f-ae8d-584addf74cae\") " Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.252134 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.252320 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.252624 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.253109 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.254851 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.257332 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.258360 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.259701 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.281949 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.299686 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl" (OuterVolumeSpecName: "kube-api-access-s7gkl") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "kube-api-access-s7gkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354498 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354541 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354555 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354569 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354582 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/8045812f-963d-4b8f-ae8d-584addf74cae-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354594 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354605 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354617 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/8045812f-963d-4b8f-ae8d-584addf74cae-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354627 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7gkl\" (UniqueName: \"kubernetes.io/projected/8045812f-963d-4b8f-ae8d-584addf74cae-kube-api-access-s7gkl\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.354638 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/8045812f-963d-4b8f-ae8d-584addf74cae-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.507393 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.558137 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.947917 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"8045812f-963d-4b8f-ae8d-584addf74cae","Type":"ContainerDied","Data":"dbfaff915d0fcafb60574b0839b0d5966137d4dc64cbbc24ce53621c7b3c2439"} Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.948028 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbfaff915d0fcafb60574b0839b0d5966137d4dc64cbbc24ce53621c7b3c2439" Mar 16 00:24:43 crc kubenswrapper[4983]: I0316 00:24:43.948150 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 16 00:24:45 crc kubenswrapper[4983]: I0316 00:24:45.181988 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "8045812f-963d-4b8f-ae8d-584addf74cae" (UID: "8045812f-963d-4b8f-ae8d-584addf74cae"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:45 crc kubenswrapper[4983]: I0316 00:24:45.188549 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/8045812f-963d-4b8f-ae8d-584addf74cae-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.637821 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638372 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="manage-dockerfile" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638388 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="manage-dockerfile" Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638402 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159f5145-349d-4018-a8d2-251363a76196" containerName="oc" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638411 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="159f5145-349d-4018-a8d2-251363a76196" containerName="oc" Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638429 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="docker-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638439 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="docker-build" Mar 16 00:24:47 crc kubenswrapper[4983]: E0316 00:24:47.638458 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="git-clone" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638466 4983 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="git-clone" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638619 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="159f5145-349d-4018-a8d2-251363a76196" containerName="oc" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.638640 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="8045812f-963d-4b8f-ae8d-584addf74cae" containerName="docker-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.639443 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.642940 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.643068 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.643298 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.644159 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.651705 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722463 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722515 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722549 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722565 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722584 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722604 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722651 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722679 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722698 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722717 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722732 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.722874 4983 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823772 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823835 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823863 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823886 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823923 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823995 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824108 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824145 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.823923 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824291 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4fdb\" (UniqueName: 
\"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824305 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824329 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824354 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824406 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824474 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod 
\"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824987 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.824994 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.825154 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.825263 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.825586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc 
kubenswrapper[4983]: I0316 00:24:47.826401 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.830215 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.830291 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.856123 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"sg-core-1-build\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " pod="service-telemetry/sg-core-1-build" Mar 16 00:24:47 crc kubenswrapper[4983]: I0316 00:24:47.964409 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:48 crc kubenswrapper[4983]: I0316 00:24:48.167249 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:48 crc kubenswrapper[4983]: I0316 00:24:48.987237 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerStarted","Data":"f4c1f0dfd1f9245a01177dd7e628b5c4473d772529ba0c664747e93e42f10dc2"} Mar 16 00:24:49 crc kubenswrapper[4983]: I0316 00:24:49.996226 4983 generic.go:334] "Generic (PLEG): container finished" podID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" exitCode=0 Mar 16 00:24:49 crc kubenswrapper[4983]: I0316 00:24:49.996273 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerDied","Data":"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792"} Mar 16 00:24:51 crc kubenswrapper[4983]: I0316 00:24:51.006528 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerStarted","Data":"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1"} Mar 16 00:24:51 crc kubenswrapper[4983]: I0316 00:24:51.042396 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=4.042373256 podStartE2EDuration="4.042373256s" podCreationTimestamp="2026-03-16 00:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:24:51.035573565 +0000 UTC m=+1099.635672005" watchObservedRunningTime="2026-03-16 00:24:51.042373256 +0000 UTC m=+1099.642471706" Mar 16 00:24:58 crc 
kubenswrapper[4983]: I0316 00:24:58.047639 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.048529 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="docker-build" containerID="cri-o://c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" gracePeriod=30 Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.371691 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_d94841dc-28bc-4de0-a8c2-0f64f533a06a/docker-build/0.log" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.372251 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488040 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488136 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488160 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 
crc kubenswrapper[4983]: I0316 00:24:58.488153 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488199 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488220 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488240 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488263 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488309 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488333 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488433 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488512 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488555 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") pod \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\" (UID: \"d94841dc-28bc-4de0-a8c2-0f64f533a06a\") " Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.488860 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir" (OuterVolumeSpecName: 
"buildcachedir") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489292 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489364 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489476 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.489831 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.490697 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.491687 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.491712 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.491722 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492176 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492192 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492201 4983 reconciler_common.go:293] "Volume 
detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d94841dc-28bc-4de0-a8c2-0f64f533a06a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.492211 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.494441 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.494522 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.495003 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb" (OuterVolumeSpecName: "kube-api-access-t4fdb") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "kube-api-access-t4fdb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.587635 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: "d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593136 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593165 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4fdb\" (UniqueName: \"kubernetes.io/projected/d94841dc-28bc-4de0-a8c2-0f64f533a06a-kube-api-access-t4fdb\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593178 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.593189 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/d94841dc-28bc-4de0-a8c2-0f64f533a06a-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.750450 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d94841dc-28bc-4de0-a8c2-0f64f533a06a" (UID: 
"d94841dc-28bc-4de0-a8c2-0f64f533a06a"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:24:58 crc kubenswrapper[4983]: I0316 00:24:58.794736 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d94841dc-28bc-4de0-a8c2-0f64f533a06a-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.056081 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_d94841dc-28bc-4de0-a8c2-0f64f533a06a/docker-build/0.log" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057178 4983 generic.go:334] "Generic (PLEG): container finished" podID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" exitCode=1 Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057230 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerDied","Data":"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1"} Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057294 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"d94841dc-28bc-4de0-a8c2-0f64f533a06a","Type":"ContainerDied","Data":"f4c1f0dfd1f9245a01177dd7e628b5c4473d772529ba0c664747e93e42f10dc2"} Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057327 4983 scope.go:117] "RemoveContainer" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.057325 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.103933 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.111460 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.119494 4983 scope.go:117] "RemoveContainer" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.142700 4983 scope.go:117] "RemoveContainer" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" Mar 16 00:24:59 crc kubenswrapper[4983]: E0316 00:24:59.145265 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1\": container with ID starting with c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1 not found: ID does not exist" containerID="c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.145309 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1"} err="failed to get container status \"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1\": rpc error: code = NotFound desc = could not find container \"c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1\": container with ID starting with c26116c9006e036f70156490a208dfba4089a4e62902b340f0c5e568d2dbefd1 not found: ID does not exist" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.145337 4983 scope.go:117] "RemoveContainer" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" Mar 16 00:24:59 crc 
kubenswrapper[4983]: E0316 00:24:59.145708 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792\": container with ID starting with 70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792 not found: ID does not exist" containerID="70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.145745 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792"} err="failed to get container status \"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792\": rpc error: code = NotFound desc = could not find container \"70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792\": container with ID starting with 70216ee0f3390530f441a60b50088f63907221981b4627b9bb513d9ece7fe792 not found: ID does not exist" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.642131 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: E0316 00:24:59.642833 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="manage-dockerfile" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.642869 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="manage-dockerfile" Mar 16 00:24:59 crc kubenswrapper[4983]: E0316 00:24:59.642892 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="docker-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.642905 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="docker-build" Mar 16 
00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.643089 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" containerName="docker-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.644591 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.647857 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.647939 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.647960 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.648452 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.681249 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712544 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712613 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " 
pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712661 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712720 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712807 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712846 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.712905 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"sg-core-2-build\" (UID: 
\"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713016 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713086 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713140 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713193 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.713297 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"sg-core-2-build\" (UID: 
\"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.813884 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.813942 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.813971 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814001 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814042 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: 
I0316 00:24:59.814062 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814099 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814122 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814131 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814158 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814179 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814210 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814235 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.814984 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815383 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815398 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815519 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815701 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.815814 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"sg-core-2-build\" (UID: 
\"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.826448 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.826450 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.830470 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"sg-core-2-build\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " pod="service-telemetry/sg-core-2-build" Mar 16 00:24:59 crc kubenswrapper[4983]: I0316 00:24:59.967668 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:25:00 crc kubenswrapper[4983]: I0316 00:25:00.100776 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94841dc-28bc-4de0-a8c2-0f64f533a06a" path="/var/lib/kubelet/pods/d94841dc-28bc-4de0-a8c2-0f64f533a06a/volumes" Mar 16 00:25:00 crc kubenswrapper[4983]: I0316 00:25:00.218962 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 16 00:25:01 crc kubenswrapper[4983]: I0316 00:25:01.096192 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerStarted","Data":"27479ee689ee6aea29760c33a15679a69d2edb724c4e782a5da2113d52bcc4b1"} Mar 16 00:25:01 crc kubenswrapper[4983]: I0316 00:25:01.096521 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerStarted","Data":"81b847e03066493ea991298e6d0dc07ee10475361fe583b28e677d8c4d775e55"} Mar 16 00:25:02 crc kubenswrapper[4983]: I0316 00:25:02.105108 4983 generic.go:334] "Generic (PLEG): container finished" podID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerID="27479ee689ee6aea29760c33a15679a69d2edb724c4e782a5da2113d52bcc4b1" exitCode=0 Mar 16 00:25:02 crc kubenswrapper[4983]: I0316 00:25:02.105169 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"27479ee689ee6aea29760c33a15679a69d2edb724c4e782a5da2113d52bcc4b1"} Mar 16 00:25:03 crc kubenswrapper[4983]: I0316 00:25:03.112602 4983 generic.go:334] "Generic (PLEG): container finished" podID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerID="d151b31f17403c86e514fa536efb02ca9d67d7e4d646b4a4907bb6661ac8f39a" exitCode=0 Mar 16 00:25:03 crc kubenswrapper[4983]: I0316 00:25:03.112656 4983 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"d151b31f17403c86e514fa536efb02ca9d67d7e4d646b4a4907bb6661ac8f39a"} Mar 16 00:25:03 crc kubenswrapper[4983]: I0316 00:25:03.156856 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_5b639c92-3fb8-4740-9242-9ced86fc4ad9/manage-dockerfile/0.log" Mar 16 00:25:04 crc kubenswrapper[4983]: I0316 00:25:04.154622 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerStarted","Data":"09b1ea7b6afbc77d3aef59a4c6c789b34bc1851435c29298bef2043739f1f368"} Mar 16 00:25:04 crc kubenswrapper[4983]: I0316 00:25:04.190489 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.190467882 podStartE2EDuration="5.190467882s" podCreationTimestamp="2026-03-16 00:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:25:04.184478703 +0000 UTC m=+1112.784577143" watchObservedRunningTime="2026-03-16 00:25:04.190467882 +0000 UTC m=+1112.790566332" Mar 16 00:25:53 crc kubenswrapper[4983]: I0316 00:25:53.448871 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:25:53 crc kubenswrapper[4983]: I0316 00:25:53.449433 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.138091 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.143014 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.147282 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.147616 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.148543 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.154141 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.178008 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"auto-csr-approver-29560346-xpbzq\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.279614 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"auto-csr-approver-29560346-xpbzq\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " 
pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.301235 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"auto-csr-approver-29560346-xpbzq\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.459277 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.738668 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:26:00 crc kubenswrapper[4983]: W0316 00:26:00.753182 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cb0fb90_2fa7_4376_8997_678868e0832a.slice/crio-9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff WatchSource:0}: Error finding container 9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff: Status 404 returned error can't find the container with id 9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff Mar 16 00:26:00 crc kubenswrapper[4983]: I0316 00:26:00.801830 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" event={"ID":"3cb0fb90-2fa7-4376-8997-678868e0832a","Type":"ContainerStarted","Data":"9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff"} Mar 16 00:26:02 crc kubenswrapper[4983]: I0316 00:26:02.814655 4983 generic.go:334] "Generic (PLEG): container finished" podID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerID="2ff36755f0536b7a806a46f7132557ff7e7da3301af8403a26d90d34c8f6b2e8" exitCode=0 Mar 16 00:26:02 crc kubenswrapper[4983]: 
I0316 00:26:02.814713 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" event={"ID":"3cb0fb90-2fa7-4376-8997-678868e0832a","Type":"ContainerDied","Data":"2ff36755f0536b7a806a46f7132557ff7e7da3301af8403a26d90d34c8f6b2e8"} Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.030170 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.132229 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") pod \"3cb0fb90-2fa7-4376-8997-678868e0832a\" (UID: \"3cb0fb90-2fa7-4376-8997-678868e0832a\") " Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.137456 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68" (OuterVolumeSpecName: "kube-api-access-fws68") pod "3cb0fb90-2fa7-4376-8997-678868e0832a" (UID: "3cb0fb90-2fa7-4376-8997-678868e0832a"). InnerVolumeSpecName "kube-api-access-fws68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.233603 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fws68\" (UniqueName: \"kubernetes.io/projected/3cb0fb90-2fa7-4376-8997-678868e0832a-kube-api-access-fws68\") on node \"crc\" DevicePath \"\"" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.827143 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" event={"ID":"3cb0fb90-2fa7-4376-8997-678868e0832a","Type":"ContainerDied","Data":"9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff"} Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.827178 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb6edb722ad22c1ef6bd170e5533d8c3f2fc40539d090a7f39fc9463b9833ff" Mar 16 00:26:04 crc kubenswrapper[4983]: I0316 00:26:04.827195 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560346-xpbzq" Mar 16 00:26:05 crc kubenswrapper[4983]: I0316 00:26:05.085787 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:26:05 crc kubenswrapper[4983]: I0316 00:26:05.092888 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560340-664mq"] Mar 16 00:26:06 crc kubenswrapper[4983]: I0316 00:26:06.110205 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3356aa9a-4f16-4602-97b0-1118f7e55776" path="/var/lib/kubelet/pods/3356aa9a-4f16-4602-97b0-1118f7e55776/volumes" Mar 16 00:26:06 crc kubenswrapper[4983]: I0316 00:26:06.592613 4983 scope.go:117] "RemoveContainer" containerID="ec962f764e58dc18fb35bd2bf73250ec727cbdfcfdec0a585462238f6e2032c9" Mar 16 00:26:23 crc kubenswrapper[4983]: I0316 00:26:23.448430 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:26:23 crc kubenswrapper[4983]: I0316 00:26:23.449100 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.448481 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449015 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449062 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449686 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 16 00:26:53 crc kubenswrapper[4983]: I0316 00:26:53.449738 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf" gracePeriod=600 Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.181822 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf" exitCode=0 Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.181893 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf"} Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.182348 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350"} Mar 16 00:26:54 crc kubenswrapper[4983]: I0316 00:26:54.182377 4983 scope.go:117] "RemoveContainer" containerID="46c022992a1c1aeeb47c6d405474573b981c1ce7a0658e8eab3f5cf112a6afc5" Mar 16 00:27:57 crc kubenswrapper[4983]: I0316 00:27:57.591354 4983 generic.go:334] "Generic (PLEG): container finished" podID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerID="09b1ea7b6afbc77d3aef59a4c6c789b34bc1851435c29298bef2043739f1f368" exitCode=0 Mar 16 00:27:57 crc kubenswrapper[4983]: I0316 00:27:57.591438 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" 
event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"09b1ea7b6afbc77d3aef59a4c6c789b34bc1851435c29298bef2043739f1f368"} Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.856245 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898650 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898731 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898802 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898829 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898850 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898873 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898888 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898908 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898923 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898941 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc 
kubenswrapper[4983]: I0316 00:27:58.898959 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.898976 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") pod \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\" (UID: \"5b639c92-3fb8-4740-9242-9ced86fc4ad9\") " Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.900044 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.900552 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.900952 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). 
InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.901325 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.901778 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.902104 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.905355 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.905865 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg" (OuterVolumeSpecName: "kube-api-access-mhjkg") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "kube-api-access-mhjkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.906877 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:27:58 crc kubenswrapper[4983]: I0316 00:27:58.913638 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.000573 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.000882 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.000955 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001014 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001071 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001127 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/5b639c92-3fb8-4740-9242-9ced86fc4ad9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001183 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 
00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001250 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhjkg\" (UniqueName: \"kubernetes.io/projected/5b639c92-3fb8-4740-9242-9ced86fc4ad9-kube-api-access-mhjkg\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001309 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/5b639c92-3fb8-4740-9242-9ced86fc4ad9-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.001366 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.249520 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.305546 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.607326 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"5b639c92-3fb8-4740-9242-9ced86fc4ad9","Type":"ContainerDied","Data":"81b847e03066493ea991298e6d0dc07ee10475361fe583b28e677d8c4d775e55"} Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.607656 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b847e03066493ea991298e6d0dc07ee10475361fe583b28e677d8c4d775e55" Mar 16 00:27:59 crc kubenswrapper[4983]: I0316 00:27:59.607396 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144007 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:28:00 crc kubenswrapper[4983]: E0316 00:28:00.144303 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerName="oc" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144317 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerName="oc" Mar 16 00:28:00 crc kubenswrapper[4983]: E0316 00:28:00.144339 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="git-clone" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144346 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="git-clone" Mar 16 00:28:00 crc 
kubenswrapper[4983]: E0316 00:28:00.144353 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="manage-dockerfile" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144361 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="manage-dockerfile" Mar 16 00:28:00 crc kubenswrapper[4983]: E0316 00:28:00.144377 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="docker-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144384 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="docker-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144489 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" containerName="oc" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.144499 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b639c92-3fb8-4740-9242-9ced86fc4ad9" containerName="docker-build" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.145284 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.147739 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.147896 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.148072 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.149233 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.216401 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"auto-csr-approver-29560348-bfz7x\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.317339 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"auto-csr-approver-29560348-bfz7x\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.332450 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"auto-csr-approver-29560348-bfz7x\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " 
pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.475017 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.657404 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:28:00 crc kubenswrapper[4983]: I0316 00:28:00.662879 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:28:01 crc kubenswrapper[4983]: I0316 00:28:01.387310 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "5b639c92-3fb8-4740-9242-9ced86fc4ad9" (UID: "5b639c92-3fb8-4740-9242-9ced86fc4ad9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:01 crc kubenswrapper[4983]: I0316 00:28:01.445108 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/5b639c92-3fb8-4740-9242-9ced86fc4ad9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:01 crc kubenswrapper[4983]: I0316 00:28:01.621198 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" event={"ID":"ee9b49cf-d10f-4047-aa75-b89a01652d64","Type":"ContainerStarted","Data":"a1cb8041e3041462e906e27b823bce28977cc94d8a26168231ac6b952be270f5"} Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.469816 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.472018 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.478295 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.478876 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.479024 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.479104 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.494647 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572285 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572345 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572418 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572451 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572483 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572530 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572571 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572730 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572814 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572849 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572888 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.572933 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.634895 4983 generic.go:334] "Generic (PLEG): container finished" 
podID="ee9b49cf-d10f-4047-aa75-b89a01652d64" containerID="78e4a4ba6d978cbff10ef4b981fae6b3ca9ed9e86332628a7ba7f28e90fd7ea7" exitCode=0 Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.634948 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" event={"ID":"ee9b49cf-d10f-4047-aa75-b89a01652d64","Type":"ContainerDied","Data":"78e4a4ba6d978cbff10ef4b981fae6b3ca9ed9e86332628a7ba7f28e90fd7ea7"} Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.674220 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.674303 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.674327 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.675178 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: 
I0316 00:28:03.675704 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.675785 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.675864 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676653 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676683 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676714 4983 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.676736 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677079 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677126 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677186 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677254 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" 
(UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677276 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.677331 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.678387 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.678893 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.678949 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod 
\"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.696359 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.697482 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.698092 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.701057 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"sg-bridge-1-build\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:03 crc kubenswrapper[4983]: I0316 00:28:03.787971 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.243039 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.642510 4983 generic.go:334] "Generic (PLEG): container finished" podID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerID="e7a4f48409b5b7c54ae90135f67eb66bb8944343dc341edc9f4aefa5c3b5777f" exitCode=0 Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.643268 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerDied","Data":"e7a4f48409b5b7c54ae90135f67eb66bb8944343dc341edc9f4aefa5c3b5777f"} Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.643293 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerStarted","Data":"0210089ac19e5c094dcc7240af8c61cd94f591ab00ebf2020e2d3bf726eca0d6"} Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.912246 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:04 crc kubenswrapper[4983]: I0316 00:28:04.997269 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") pod \"ee9b49cf-d10f-4047-aa75-b89a01652d64\" (UID: \"ee9b49cf-d10f-4047-aa75-b89a01652d64\") " Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.002354 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7" (OuterVolumeSpecName: "kube-api-access-tm2z7") pod "ee9b49cf-d10f-4047-aa75-b89a01652d64" (UID: "ee9b49cf-d10f-4047-aa75-b89a01652d64"). InnerVolumeSpecName "kube-api-access-tm2z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.098817 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm2z7\" (UniqueName: \"kubernetes.io/projected/ee9b49cf-d10f-4047-aa75-b89a01652d64-kube-api-access-tm2z7\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.651881 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerStarted","Data":"591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a"} Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.654272 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" event={"ID":"ee9b49cf-d10f-4047-aa75-b89a01652d64","Type":"ContainerDied","Data":"a1cb8041e3041462e906e27b823bce28977cc94d8a26168231ac6b952be270f5"} Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.654324 4983 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a1cb8041e3041462e906e27b823bce28977cc94d8a26168231ac6b952be270f5" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.654338 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560348-bfz7x" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.683423 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.6834096130000002 podStartE2EDuration="2.683409613s" podCreationTimestamp="2026-03-16 00:28:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:28:05.676981862 +0000 UTC m=+1294.277080292" watchObservedRunningTime="2026-03-16 00:28:05.683409613 +0000 UTC m=+1294.283508033" Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.985893 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:28:05 crc kubenswrapper[4983]: I0316 00:28:05.995198 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560342-544h5"] Mar 16 00:28:06 crc kubenswrapper[4983]: I0316 00:28:06.104998 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0d707a0-1b40-4364-9a61-cde76e2c80a1" path="/var/lib/kubelet/pods/d0d707a0-1b40-4364-9a61-cde76e2c80a1/volumes" Mar 16 00:28:06 crc kubenswrapper[4983]: I0316 00:28:06.696653 4983 scope.go:117] "RemoveContainer" containerID="b6146a6dfae8df822feda4cd12d6f532571e6f993bc0ef397b108d23a0fa9361" Mar 16 00:28:11 crc kubenswrapper[4983]: I0316 00:28:11.694999 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_3c28cfcf-6ddc-4eaa-956a-f42746839382/docker-build/0.log" Mar 16 00:28:11 crc kubenswrapper[4983]: I0316 00:28:11.696466 4983 generic.go:334] "Generic (PLEG): container finished" 
podID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerID="591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a" exitCode=1 Mar 16 00:28:11 crc kubenswrapper[4983]: I0316 00:28:11.696500 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerDied","Data":"591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a"} Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.039060 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_3c28cfcf-6ddc-4eaa-956a-f42746839382/docker-build/0.log" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.041221 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116621 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116687 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116720 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 
00:28:13.116744 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116808 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116836 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116863 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116897 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116882 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116923 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.116957 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117006 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117024 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") pod \"3c28cfcf-6ddc-4eaa-956a-f42746839382\" (UID: \"3c28cfcf-6ddc-4eaa-956a-f42746839382\") " Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117265 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-node-pullsecrets\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117686 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117171 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.117970 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.118166 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.118558 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122010 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122114 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122515 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.122579 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg" (OuterVolumeSpecName: "kube-api-access-8l5lg") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "kube-api-access-8l5lg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.194552 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218384 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218434 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218453 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218472 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218492 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218511 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l5lg\" (UniqueName: \"kubernetes.io/projected/3c28cfcf-6ddc-4eaa-956a-f42746839382-kube-api-access-8l5lg\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218527 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218546 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/3c28cfcf-6ddc-4eaa-956a-f42746839382-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218562 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.218583 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/3c28cfcf-6ddc-4eaa-956a-f42746839382-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.498209 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "3c28cfcf-6ddc-4eaa-956a-f42746839382" (UID: "3c28cfcf-6ddc-4eaa-956a-f42746839382"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.523163 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/3c28cfcf-6ddc-4eaa-956a-f42746839382-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.712519 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_3c28cfcf-6ddc-4eaa-956a-f42746839382/docker-build/0.log" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.713203 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"3c28cfcf-6ddc-4eaa-956a-f42746839382","Type":"ContainerDied","Data":"0210089ac19e5c094dcc7240af8c61cd94f591ab00ebf2020e2d3bf726eca0d6"} Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.713261 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0210089ac19e5c094dcc7240af8c61cd94f591ab00ebf2020e2d3bf726eca0d6" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.713311 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.850469 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:13 crc kubenswrapper[4983]: I0316 00:28:13.856219 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 16 00:28:14 crc kubenswrapper[4983]: I0316 00:28:14.104405 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" path="/var/lib/kubelet/pods/3c28cfcf-6ddc-4eaa-956a-f42746839382/volumes" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.519914 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:28:15 crc kubenswrapper[4983]: E0316 00:28:15.520178 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="docker-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520194 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="docker-build" Mar 16 00:28:15 crc kubenswrapper[4983]: E0316 00:28:15.520211 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="manage-dockerfile" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520219 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="manage-dockerfile" Mar 16 00:28:15 crc kubenswrapper[4983]: E0316 00:28:15.520234 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" containerName="oc" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520242 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" containerName="oc" Mar 16 00:28:15 crc 
kubenswrapper[4983]: I0316 00:28:15.520377 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c28cfcf-6ddc-4eaa-956a-f42746839382" containerName="docker-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.520391 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" containerName="oc" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.521424 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.524170 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.524632 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.524942 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.527636 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.542408 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656117 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656166 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656198 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656216 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656305 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656378 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656415 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656485 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656599 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656659 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656683 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.656715 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758349 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758418 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758456 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758476 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758496 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758526 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758550 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758559 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758588 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758627 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758660 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758687 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758709 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.758798 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build" Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759115 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod 
\"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759141 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759270 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759424 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.759518 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.760182 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.763219 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.771385 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.777101 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"sg-bridge-2-build\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") " pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:15 crc kubenswrapper[4983]: I0316 00:28:15.845145 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:28:16 crc kubenswrapper[4983]: I0316 00:28:16.303216 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"]
Mar 16 00:28:16 crc kubenswrapper[4983]: I0316 00:28:16.733112 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerStarted","Data":"4b7e8f0b0262d13b5d3dad043009038fc9741d48b180835616abe0b9cb4b0b85"}
Mar 16 00:28:16 crc kubenswrapper[4983]: I0316 00:28:16.733460 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerStarted","Data":"1172d46c4823802088437b057d123769d09a25da1fff7fc5d7b9e421e92ae6f5"}
Mar 16 00:28:17 crc kubenswrapper[4983]: I0316 00:28:17.741018 4983 generic.go:334] "Generic (PLEG): container finished" podID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerID="4b7e8f0b0262d13b5d3dad043009038fc9741d48b180835616abe0b9cb4b0b85" exitCode=0
Mar 16 00:28:17 crc kubenswrapper[4983]: I0316 00:28:17.741081 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"4b7e8f0b0262d13b5d3dad043009038fc9741d48b180835616abe0b9cb4b0b85"}
Mar 16 00:28:18 crc kubenswrapper[4983]: I0316 00:28:18.747690 4983 generic.go:334] "Generic (PLEG): container finished" podID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerID="94ea866889c4c93c4d2c2bf288edba184639a401760736267e888a00e8c78537" exitCode=0
Mar 16 00:28:18 crc kubenswrapper[4983]: I0316 00:28:18.747781 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"94ea866889c4c93c4d2c2bf288edba184639a401760736267e888a00e8c78537"}
Mar 16 00:28:18 crc kubenswrapper[4983]: I0316 00:28:18.799006 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_bf3cda5e-7ab3-44d7-baa0-d98b65d0d759/manage-dockerfile/0.log"
Mar 16 00:28:19 crc kubenswrapper[4983]: I0316 00:28:19.758998 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerStarted","Data":"3d928fbc21c2d15074f4f972b682df33246ef2426138d87f9d0046e2e737dafb"}
Mar 16 00:28:53 crc kubenswrapper[4983]: I0316 00:28:53.448502 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:28:53 crc kubenswrapper[4983]: I0316 00:28:53.449385 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:29:03 crc kubenswrapper[4983]: I0316 00:29:03.067229 4983 generic.go:334] "Generic (PLEG): container finished" podID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerID="3d928fbc21c2d15074f4f972b682df33246ef2426138d87f9d0046e2e737dafb" exitCode=0
Mar 16 00:29:03 crc kubenswrapper[4983]: I0316 00:29:03.067317 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"3d928fbc21c2d15074f4f972b682df33246ef2426138d87f9d0046e2e737dafb"}
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.352280 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.467848 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468146 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468183 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468270 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468289 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468303 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468324 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468357 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468373 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468389 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468406 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468441 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") pod \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\" (UID: \"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759\") "
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468661 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.468687 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.469748 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.469937 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.470280 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.472550 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.477092 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.477111 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb" (OuterVolumeSpecName: "kube-api-access-zdqfb") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "kube-api-access-zdqfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.477187 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.478646 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569391 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569426 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569470 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569483 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569494 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569505 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdqfb\" (UniqueName: \"kubernetes.io/projected/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-kube-api-access-zdqfb\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569516 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569529 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569539 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.569550 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.579385 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:04 crc kubenswrapper[4983]: I0316 00:29:04.670936 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.084563 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"bf3cda5e-7ab3-44d7-baa0-d98b65d0d759","Type":"ContainerDied","Data":"1172d46c4823802088437b057d123769d09a25da1fff7fc5d7b9e421e92ae6f5"}
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.084605 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build"
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.084614 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1172d46c4823802088437b057d123769d09a25da1fff7fc5d7b9e421e92ae6f5"
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.223709 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" (UID: "bf3cda5e-7ab3-44d7-baa0-d98b65d0d759"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:29:05 crc kubenswrapper[4983]: I0316 00:29:05.280359 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/bf3cda5e-7ab3-44d7-baa0-d98b65d0d759-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.457864 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:08 crc kubenswrapper[4983]: E0316 00:29:08.458435 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="git-clone"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458451 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="git-clone"
Mar 16 00:29:08 crc kubenswrapper[4983]: E0316 00:29:08.458461 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="manage-dockerfile"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458468 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="manage-dockerfile"
Mar 16 00:29:08 crc kubenswrapper[4983]: E0316 00:29:08.458496 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="docker-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458506 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="docker-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.458621 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf3cda5e-7ab3-44d7-baa0-d98b65d0d759" containerName="docker-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.459414 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.461143 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.461183 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.461187 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.462763 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.468741 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521580 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521633 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521655 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521768 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521821 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521859 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521878 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521897 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521916 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521949 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.521971 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.522026 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622330 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622380 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622397 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622431 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622455 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622473 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622487 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622506 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622526 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622552 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622571 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622590 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.622671 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.623345 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624076 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624127 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624306 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.624548 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.625239 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.625662 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.625988 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.629028 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.629483 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.641837 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.775352 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 16 00:29:08 crc kubenswrapper[4983]: I0316 00:29:08.944780 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 16 00:29:09 crc kubenswrapper[4983]: I0316 00:29:09.109172 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerStarted","Data":"e0f3e3718874685653e30f81f0245af34cc90402115bbca57e3bcd0a3130386f"}
Mar 16 00:29:10 crc kubenswrapper[4983]: I0316 00:29:10.116370 4983 generic.go:334] "Generic (PLEG): container finished" podID="48198167-8197-43b5-847b-6573fc24f312" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae" exitCode=0
Mar 16 00:29:10 crc kubenswrapper[4983]: I0316 00:29:10.116463 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerDied","Data":"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae"}
Mar 16 00:29:11 crc kubenswrapper[4983]: I0316 00:29:11.123534 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerStarted","Data":"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"}
Mar 16 00:29:11 crc kubenswrapper[4983]: I0316 00:29:11.146298 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.146280803
podStartE2EDuration="3.146280803s" podCreationTimestamp="2026-03-16 00:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:29:11.141545858 +0000 UTC m=+1359.741644278" watchObservedRunningTime="2026-03-16 00:29:11.146280803 +0000 UTC m=+1359.746379233" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.258935 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.259836 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build" containerID="cri-o://b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970" gracePeriod=30 Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.685444 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_48198167-8197-43b5-847b-6573fc24f312/docker-build/0.log" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.686461 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780433 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780492 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780528 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780547 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780579 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780602 4983 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780623 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780637 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780668 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780731 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780788 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780807 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") pod \"48198167-8197-43b5-847b-6573fc24f312\" (UID: \"48198167-8197-43b5-847b-6573fc24f312\") " Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780850 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.780977 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.781054 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.781068 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/48198167-8197-43b5-847b-6573fc24f312-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783744 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783901 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783994 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.783982 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.784266 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.789237 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj" (OuterVolumeSpecName: "kube-api-access-wbjsj") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "kube-api-access-wbjsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.789272 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.789535 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.854020 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882923 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882959 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882970 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882980 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: 
\"kubernetes.io/secret/48198167-8197-43b5-847b-6573fc24f312-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.882992 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883004 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883013 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbjsj\" (UniqueName: \"kubernetes.io/projected/48198167-8197-43b5-847b-6573fc24f312-kube-api-access-wbjsj\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883021 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:19 crc kubenswrapper[4983]: I0316 00:29:19.883029 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48198167-8197-43b5-847b-6573fc24f312-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.154880 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "48198167-8197-43b5-847b-6573fc24f312" (UID: "48198167-8197-43b5-847b-6573fc24f312"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180004 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_48198167-8197-43b5-847b-6573fc24f312/docker-build/0.log" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180740 4983 generic.go:334] "Generic (PLEG): container finished" podID="48198167-8197-43b5-847b-6573fc24f312" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970" exitCode=1 Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180840 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerDied","Data":"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"} Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180886 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"48198167-8197-43b5-847b-6573fc24f312","Type":"ContainerDied","Data":"e0f3e3718874685653e30f81f0245af34cc90402115bbca57e3bcd0a3130386f"} Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.180919 4983 scope.go:117] "RemoveContainer" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.181019 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.187995 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/48198167-8197-43b5-847b-6573fc24f312-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.204002 4983 scope.go:117] "RemoveContainer" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.227660 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.231934 4983 scope.go:117] "RemoveContainer" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970" Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 00:29:20.232481 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970\": container with ID starting with b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970 not found: ID does not exist" containerID="b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.232548 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970"} err="failed to get container status \"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970\": rpc error: code = NotFound desc = could not find container \"b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970\": container with ID starting with b58d07f46dd043e09a4cfaf3f7e95f956034d068fc6659ebf3fb3fc717fd0970 not found: ID does not exist" Mar 16 00:29:20 crc kubenswrapper[4983]: 
I0316 00:29:20.232587 4983 scope.go:117] "RemoveContainer" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae" Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 00:29:20.233208 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae\": container with ID starting with 7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae not found: ID does not exist" containerID="7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.233826 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae"} err="failed to get container status \"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae\": rpc error: code = NotFound desc = could not find container \"7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae\": container with ID starting with 7da7c98a2ebf809106dd1239d2ae0610e7330b5c9ee07869d9dd9941e9c76dae not found: ID does not exist" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.234013 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.890615 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 00:29:20.890962 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="manage-dockerfile" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.890979 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="manage-dockerfile" Mar 16 00:29:20 crc kubenswrapper[4983]: E0316 
00:29:20.891008 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.891015 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.891157 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="48198167-8197-43b5-847b-6573fc24f312" containerName="docker-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.892447 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.894285 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-88vdw" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.894704 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.895327 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.895355 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.914917 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999642 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: 
\"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999700 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999766 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999794 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999882 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999910 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:20 crc kubenswrapper[4983]: I0316 00:29:20.999938 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000097 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000159 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000203 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000239 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.000291 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101391 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101440 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101486 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlgqf\" (UniqueName: 
\"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101508 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101633 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101783 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101788 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101918 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101960 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101956 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.101993 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102067 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 
00:29:21.102100 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102124 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102266 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102336 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102379 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc 
kubenswrapper[4983]: I0316 00:29:21.102541 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102612 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102708 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.102996 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.105077 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " 
pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.114287 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.119262 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") " pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.213462 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 16 00:29:21 crc kubenswrapper[4983]: I0316 00:29:21.399781 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"] Mar 16 00:29:22 crc kubenswrapper[4983]: I0316 00:29:22.101918 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48198167-8197-43b5-847b-6573fc24f312" path="/var/lib/kubelet/pods/48198167-8197-43b5-847b-6573fc24f312/volumes" Mar 16 00:29:22 crc kubenswrapper[4983]: I0316 00:29:22.195824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerStarted","Data":"5ba424032e01a5aec1dafe53a9a7d6f1b7244d21b8fbf332aa8bc6d19c1c8863"} Mar 16 00:29:22 crc kubenswrapper[4983]: I0316 00:29:22.195872 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerStarted","Data":"dc3a9ca53d1d789e35b74d5f0511c33239a8145d878bf5432c485cec4d0b52c4"} Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.206454 4983 generic.go:334] "Generic (PLEG): container finished" podID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerID="5ba424032e01a5aec1dafe53a9a7d6f1b7244d21b8fbf332aa8bc6d19c1c8863" exitCode=0 Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.206875 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"5ba424032e01a5aec1dafe53a9a7d6f1b7244d21b8fbf332aa8bc6d19c1c8863"} Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.448694 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:23 crc kubenswrapper[4983]: I0316 00:29:23.448788 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:29:24 crc kubenswrapper[4983]: I0316 00:29:24.213563 4983 generic.go:334] "Generic (PLEG): container finished" podID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerID="c57595e008cb9889dca6c1a1c59e02696afe2ba9673c999bed193761ef0ca2b8" exitCode=0 Mar 16 00:29:24 crc kubenswrapper[4983]: I0316 00:29:24.213608 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" 
event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"c57595e008cb9889dca6c1a1c59e02696afe2ba9673c999bed193761ef0ca2b8"} Mar 16 00:29:24 crc kubenswrapper[4983]: I0316 00:29:24.253268 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_de1c6f9c-f134-479d-a5ae-5ab93b30b2e3/manage-dockerfile/0.log" Mar 16 00:29:25 crc kubenswrapper[4983]: I0316 00:29:25.222487 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerStarted","Data":"5f7cbddba7e24ee372140ec3e6d1283660f1e04ffafec527fdafc57d3a279e56"} Mar 16 00:29:25 crc kubenswrapper[4983]: I0316 00:29:25.250426 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.250411828 podStartE2EDuration="5.250411828s" podCreationTimestamp="2026-03-16 00:29:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:29:25.248541608 +0000 UTC m=+1373.848640048" watchObservedRunningTime="2026-03-16 00:29:25.250411828 +0000 UTC m=+1373.850510248" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.448336 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.448899 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.448948 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.449609 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:29:53 crc kubenswrapper[4983]: I0316 00:29:53.449671 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350" gracePeriod=600 Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437517 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350" exitCode=0 Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437587 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350"} Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437916 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" 
event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"} Mar 16 00:29:54 crc kubenswrapper[4983]: I0316 00:29:54.437943 4983 scope.go:117] "RemoveContainer" containerID="4952feacc34350796d6119e57c5c2963c6c739a2d4a6116de514f57eada3dedf" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.160025 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.161738 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.164949 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.165614 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.165837 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.171146 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.172371 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.179826 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.180019 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.187857 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.192416 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h"] Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262018 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262083 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262136 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwz6\" (UniqueName: 
\"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"auto-csr-approver-29560350-spjzd\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") " pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.262171 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.362938 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"auto-csr-approver-29560350-spjzd\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") " pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.363000 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.363088 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 
00:30:00.363118 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.364489 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.384587 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.386895 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"auto-csr-approver-29560350-spjzd\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") " pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.389089 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod \"collect-profiles-29560350-w9t6h\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.493418 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.507471 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.886440 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:30:00 crc kubenswrapper[4983]: W0316 00:30:00.895371 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57c14e51_5c0b_467c_ba79_ac6f39239445.slice/crio-02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989 WatchSource:0}: Error finding container 02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989: Status 404 returned error can't find the container with id 02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989 Mar 16 00:30:00 crc kubenswrapper[4983]: I0316 00:30:00.937408 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h"] Mar 16 00:30:00 crc kubenswrapper[4983]: W0316 00:30:00.939149 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c811eb7_248a_49ed_be14_95285f2c4400.slice/crio-d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe WatchSource:0}: Error finding container d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe: Status 404 returned error can't find the container with id d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.504724 4983 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-spjzd" event={"ID":"57c14e51-5c0b-467c-ba79-ac6f39239445","Type":"ContainerStarted","Data":"02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989"} Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.506670 4983 generic.go:334] "Generic (PLEG): container finished" podID="6c811eb7-248a-49ed-be14-95285f2c4400" containerID="5bc308bbb2692ff07a3afcb9644b74f23c0b61727bee2ee07e82ad016a8bf8a0" exitCode=0 Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.506707 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" event={"ID":"6c811eb7-248a-49ed-be14-95285f2c4400","Type":"ContainerDied","Data":"5bc308bbb2692ff07a3afcb9644b74f23c0b61727bee2ee07e82ad016a8bf8a0"} Mar 16 00:30:01 crc kubenswrapper[4983]: I0316 00:30:01.506722 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" event={"ID":"6c811eb7-248a-49ed-be14-95285f2c4400","Type":"ContainerStarted","Data":"d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe"} Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.756676 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.796125 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") pod \"6c811eb7-248a-49ed-be14-95285f2c4400\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.796457 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") pod \"6c811eb7-248a-49ed-be14-95285f2c4400\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.796504 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") pod \"6c811eb7-248a-49ed-be14-95285f2c4400\" (UID: \"6c811eb7-248a-49ed-be14-95285f2c4400\") " Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.797518 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c811eb7-248a-49ed-be14-95285f2c4400" (UID: "6c811eb7-248a-49ed-be14-95285f2c4400"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.801958 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c811eb7-248a-49ed-be14-95285f2c4400" (UID: "6c811eb7-248a-49ed-be14-95285f2c4400"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.811922 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n" (OuterVolumeSpecName: "kube-api-access-7gm9n") pod "6c811eb7-248a-49ed-be14-95285f2c4400" (UID: "6c811eb7-248a-49ed-be14-95285f2c4400"). InnerVolumeSpecName "kube-api-access-7gm9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.898274 4983 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c811eb7-248a-49ed-be14-95285f2c4400-config-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.898311 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gm9n\" (UniqueName: \"kubernetes.io/projected/6c811eb7-248a-49ed-be14-95285f2c4400-kube-api-access-7gm9n\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:02 crc kubenswrapper[4983]: I0316 00:30:02.898321 4983 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c811eb7-248a-49ed-be14-95285f2c4400-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.522360 4983 generic.go:334] "Generic (PLEG): container finished" podID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerID="1accee3cbf491221453bc50f3bfc4fc15297e3a876200e21860d5dd4e3e66686" exitCode=0 Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.522546 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-spjzd" event={"ID":"57c14e51-5c0b-467c-ba79-ac6f39239445","Type":"ContainerDied","Data":"1accee3cbf491221453bc50f3bfc4fc15297e3a876200e21860d5dd4e3e66686"} Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.523637 4983 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h" event={"ID":"6c811eb7-248a-49ed-be14-95285f2c4400","Type":"ContainerDied","Data":"d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe"}
Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.523664 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d9f08c357bfc7cc6ab9d59d919320302923a397b75bb7715706cd3168a8dfe"
Mar 16 00:30:03 crc kubenswrapper[4983]: I0316 00:30:03.523713 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29560350-w9t6h"
Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.759117 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd"
Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.821468 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") pod \"57c14e51-5c0b-467c-ba79-ac6f39239445\" (UID: \"57c14e51-5c0b-467c-ba79-ac6f39239445\") "
Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.826220 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6" (OuterVolumeSpecName: "kube-api-access-rrwz6") pod "57c14e51-5c0b-467c-ba79-ac6f39239445" (UID: "57c14e51-5c0b-467c-ba79-ac6f39239445"). InnerVolumeSpecName "kube-api-access-rrwz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:30:04 crc kubenswrapper[4983]: I0316 00:30:04.923102 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrwz6\" (UniqueName: \"kubernetes.io/projected/57c14e51-5c0b-467c-ba79-ac6f39239445-kube-api-access-rrwz6\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.539846 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560350-spjzd" event={"ID":"57c14e51-5c0b-467c-ba79-ac6f39239445","Type":"ContainerDied","Data":"02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989"}
Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.539892 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02725799c1950b400c8e21a400391d33267171c7a94d9971a0d2c7afea9bd989"
Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.539903 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560350-spjzd"
Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.820789 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"]
Mar 16 00:30:05 crc kubenswrapper[4983]: I0316 00:30:05.824984 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560344-6jrbk"]
Mar 16 00:30:06 crc kubenswrapper[4983]: I0316 00:30:06.100131 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159f5145-349d-4018-a8d2-251363a76196" path="/var/lib/kubelet/pods/159f5145-349d-4018-a8d2-251363a76196/volumes"
Mar 16 00:30:06 crc kubenswrapper[4983]: I0316 00:30:06.760083 4983 scope.go:117] "RemoveContainer" containerID="12df34a0b20427d6c21f033a68c9272147f096ee9a9f4ed3b7d0b3054eb18184"
Mar 16 00:30:14 crc kubenswrapper[4983]: I0316 00:30:14.596921 4983 generic.go:334] "Generic (PLEG): container finished" podID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerID="5f7cbddba7e24ee372140ec3e6d1283660f1e04ffafec527fdafc57d3a279e56" exitCode=0
Mar 16 00:30:14 crc kubenswrapper[4983]: I0316 00:30:14.597018 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"5f7cbddba7e24ee372140ec3e6d1283660f1e04ffafec527fdafc57d3a279e56"}
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.846806 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890158 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890226 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890265 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890291 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890328 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890363 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890392 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890421 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890446 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890473 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890505 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890528 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") pod \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\" (UID: \"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3\") "
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.890955 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.891175 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.891213 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.891494 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.892907 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.893525 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.894293 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.895775 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf" (OuterVolumeSpecName: "kube-api-access-dlgqf") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "kube-api-access-dlgqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.895798 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull" (OuterVolumeSpecName: "builder-dockercfg-88vdw-pull") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "builder-dockercfg-88vdw-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.896899 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push" (OuterVolumeSpecName: "builder-dockercfg-88vdw-push") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "builder-dockercfg-88vdw-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.990901 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991749 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991786 4983 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991795 4983 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991805 4983 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991813 4983 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991821 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-pull\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-pull\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991832 4983 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991840 4983 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991849 4983 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-88vdw-push\" (UniqueName: \"kubernetes.io/secret/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-builder-dockercfg-88vdw-push\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991857 4983 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:15 crc kubenswrapper[4983]: I0316 00:30:15.991866 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlgqf\" (UniqueName: \"kubernetes.io/projected/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-kube-api-access-dlgqf\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.624506 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"de1c6f9c-f134-479d-a5ae-5ab93b30b2e3","Type":"ContainerDied","Data":"dc3a9ca53d1d789e35b74d5f0511c33239a8145d878bf5432c485cec4d0b52c4"}
Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.624554 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc3a9ca53d1d789e35b74d5f0511c33239a8145d878bf5432c485cec4d0b52c4"
Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.624608 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.760866 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" (UID: "de1c6f9c-f134-479d-a5ae-5ab93b30b2e3"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 16 00:30:16 crc kubenswrapper[4983]: I0316 00:30:16.802899 4983 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/de1c6f9c-f134-479d-a5ae-5ab93b30b2e3-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.498613 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-64877956d4-ljbdp"]
Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499406 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="manage-dockerfile"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499420 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="manage-dockerfile"
Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499434 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerName="oc"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499441 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerName="oc"
Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499452 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="git-clone"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499459 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="git-clone"
Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499467 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="docker-build"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499473 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="docker-build"
Mar 16 00:30:24 crc kubenswrapper[4983]: E0316 00:30:24.499489 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c811eb7-248a-49ed-be14-95285f2c4400" containerName="collect-profiles"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499496 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c811eb7-248a-49ed-be14-95285f2c4400" containerName="collect-profiles"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499630 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c811eb7-248a-49ed-be14-95285f2c4400" containerName="collect-profiles"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499647 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" containerName="oc"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.499660 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1c6f9c-f134-479d-a5ae-5ab93b30b2e3" containerName="docker-build"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.500159 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.502314 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-gpsrc"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.515673 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-64877956d4-ljbdp"]
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.692943 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-runner\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.693001 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtb6s\" (UniqueName: \"kubernetes.io/projected/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-kube-api-access-vtb6s\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.794041 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-runner\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.794291 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtb6s\" (UniqueName: \"kubernetes.io/projected/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-kube-api-access-vtb6s\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.794655 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-runner\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.816697 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtb6s\" (UniqueName: \"kubernetes.io/projected/5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d-kube-api-access-vtb6s\") pod \"smart-gateway-operator-64877956d4-ljbdp\" (UID: \"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d\") " pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:24 crc kubenswrapper[4983]: I0316 00:30:24.818356 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp"
Mar 16 00:30:25 crc kubenswrapper[4983]: I0316 00:30:25.321287 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-64877956d4-ljbdp"]
Mar 16 00:30:25 crc kubenswrapper[4983]: W0316 00:30:25.329573 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c3140b4_67ae_4012_bd8b_9cecbcb4ff4d.slice/crio-0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58 WatchSource:0}: Error finding container 0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58: Status 404 returned error can't find the container with id 0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58
Mar 16 00:30:25 crc kubenswrapper[4983]: I0316 00:30:25.678119 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" event={"ID":"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d","Type":"ContainerStarted","Data":"0e1782a6c96d0af0091b346a4b2a0ea56e6c06b786cb5ce795e42cd398cbce58"}
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.292342 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"]
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.293596 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.296331 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-q6nrk"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.327213 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"]
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.449188 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/b8f4edcf-0403-4d59-b045-e618c6aabff5-kube-api-access-96k2g\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.449244 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b8f4edcf-0403-4d59-b045-e618c6aabff5-runner\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.550231 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/b8f4edcf-0403-4d59-b045-e618c6aabff5-kube-api-access-96k2g\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.550279 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b8f4edcf-0403-4d59-b045-e618c6aabff5-runner\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.551066 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b8f4edcf-0403-4d59-b045-e618c6aabff5-runner\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.569669 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96k2g\" (UniqueName: \"kubernetes.io/projected/b8f4edcf-0403-4d59-b045-e618c6aabff5-kube-api-access-96k2g\") pod \"service-telemetry-operator-65fdb44596-qnp9k\" (UID: \"b8f4edcf-0403-4d59-b045-e618c6aabff5\") " pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:28 crc kubenswrapper[4983]: I0316 00:30:28.620661 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"
Mar 16 00:30:35 crc kubenswrapper[4983]: I0316 00:30:35.294976 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-65fdb44596-qnp9k"]
Mar 16 00:30:36 crc kubenswrapper[4983]: W0316 00:30:36.635305 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8f4edcf_0403_4d59_b045_e618c6aabff5.slice/crio-c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634 WatchSource:0}: Error finding container c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634: Status 404 returned error can't find the container with id c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634
Mar 16 00:30:36 crc kubenswrapper[4983]: I0316 00:30:36.751807 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" event={"ID":"b8f4edcf-0403-4d59-b045-e618c6aabff5","Type":"ContainerStarted","Data":"c233c6c0ae186025d223961e27523063970ef3f5f2ca414126761bf4d6844634"}
Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.033676 4983 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest"
Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.034173 4983 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773621017,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vtb6s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-64877956d4-ljbdp_service-telemetry(5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.035442 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" podUID="5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d"
Mar 16 00:30:40 crc kubenswrapper[4983]: E0316 00:30:40.785653 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" podUID="5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d"
Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.864854 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"]
Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.868072 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.868394 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"]
Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.946538 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.946583 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:42 crc kubenswrapper[4983]: I0316 00:30:42.946608 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047379 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047703 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047740 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.047893 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.048092 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.067906 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"redhat-operators-6ztth\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:43 crc kubenswrapper[4983]: I0316 00:30:43.189438 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth"
Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.668570 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"]
Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.806434 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff"}
Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.806683 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"c2f45a75c848cc0c542208a6f9b28b0dde0446c369790008bc4a1b6a2fd51cfb"}
Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.810377 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" event={"ID":"b8f4edcf-0403-4d59-b045-e618c6aabff5","Type":"ContainerStarted","Data":"9543f2b0720f02916d77fb37da1892affb82983543db57540f3d393228b0e571"}
Mar 16 00:30:44 crc kubenswrapper[4983]: I0316 00:30:44.844385 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-65fdb44596-qnp9k" podStartSLOduration=8.933393343 podStartE2EDuration="16.844365667s" podCreationTimestamp="2026-03-16 00:30:28 +0000 UTC" firstStartedPulling="2026-03-16 00:30:36.645214473 +0000 UTC m=+1445.245312923" lastFinishedPulling="2026-03-16 00:30:44.556186817 +0000 UTC m=+1453.156285247" observedRunningTime="2026-03-16 00:30:44.838982543 +0000 UTC m=+1453.439080973" watchObservedRunningTime="2026-03-16 00:30:44.844365667 +0000 UTC m=+1453.444464097"
Mar 16 00:30:45 crc kubenswrapper[4983]: I0316 00:30:45.817385 4983 generic.go:334] "Generic (PLEG): container finished"
podID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" exitCode=0 Mar 16 00:30:45 crc kubenswrapper[4983]: I0316 00:30:45.817436 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff"} Mar 16 00:30:47 crc kubenswrapper[4983]: I0316 00:30:47.831437 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df"} Mar 16 00:30:48 crc kubenswrapper[4983]: I0316 00:30:48.838474 4983 generic.go:334] "Generic (PLEG): container finished" podID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" exitCode=0 Mar 16 00:30:48 crc kubenswrapper[4983]: I0316 00:30:48.838564 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df"} Mar 16 00:30:49 crc kubenswrapper[4983]: I0316 00:30:49.846517 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerStarted","Data":"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a"} Mar 16 00:30:49 crc kubenswrapper[4983]: I0316 00:30:49.867943 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ztth" podStartSLOduration=4.239481924 podStartE2EDuration="7.867926095s" podCreationTimestamp="2026-03-16 00:30:42 +0000 UTC" 
firstStartedPulling="2026-03-16 00:30:45.818705596 +0000 UTC m=+1454.418804026" lastFinishedPulling="2026-03-16 00:30:49.447149767 +0000 UTC m=+1458.047248197" observedRunningTime="2026-03-16 00:30:49.863495197 +0000 UTC m=+1458.463593627" watchObservedRunningTime="2026-03-16 00:30:49.867926095 +0000 UTC m=+1458.468024515" Mar 16 00:30:53 crc kubenswrapper[4983]: I0316 00:30:53.202435 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:53 crc kubenswrapper[4983]: I0316 00:30:53.202938 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:30:54 crc kubenswrapper[4983]: I0316 00:30:54.258645 4983 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6ztth" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" probeResult="failure" output=< Mar 16 00:30:54 crc kubenswrapper[4983]: timeout: failed to connect service ":50051" within 1s Mar 16 00:30:54 crc kubenswrapper[4983]: > Mar 16 00:30:54 crc kubenswrapper[4983]: I0316 00:30:54.881635 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" event={"ID":"5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d","Type":"ContainerStarted","Data":"475cbfbd9b8abe2a8b2f00a7f8c8062d6047c6f9d50e30e6165d3bc7beb1a7af"} Mar 16 00:30:54 crc kubenswrapper[4983]: I0316 00:30:54.899787 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-64877956d4-ljbdp" podStartSLOduration=2.439470684 podStartE2EDuration="30.899770104s" podCreationTimestamp="2026-03-16 00:30:24 +0000 UTC" firstStartedPulling="2026-03-16 00:30:25.331584002 +0000 UTC m=+1433.931682432" lastFinishedPulling="2026-03-16 00:30:53.791883422 +0000 UTC m=+1462.391981852" observedRunningTime="2026-03-16 00:30:54.897874873 +0000 UTC 
m=+1463.497973293" watchObservedRunningTime="2026-03-16 00:30:54.899770104 +0000 UTC m=+1463.499868534" Mar 16 00:31:03 crc kubenswrapper[4983]: I0316 00:31:03.243081 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:03 crc kubenswrapper[4983]: I0316 00:31:03.321003 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:03 crc kubenswrapper[4983]: I0316 00:31:03.487353 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:31:04 crc kubenswrapper[4983]: I0316 00:31:04.947640 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6ztth" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" containerID="cri-o://5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" gracePeriod=2 Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.090387 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"] Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.091724 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.096288 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.096500 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.096832 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097048 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097192 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097205 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-kdvm9" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.097381 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.113590 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"] Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249032 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " 
pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249116 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249177 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249228 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 
00:31:05.249286 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.249307 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350476 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350588 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350638 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: 
\"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350665 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.350707 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.352529 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.353207 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 
00:31:05.353293 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.354045 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.358630 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.358635 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.358635 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.369401 4983 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.369523 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.372446 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"default-interconnect-68864d46cb-p4tqx\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.421943 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.454382 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") pod \"655a4c8a-248d-4630-889a-0932c2c2f2b9\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.454469 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") pod \"655a4c8a-248d-4630-889a-0932c2c2f2b9\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.454574 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") pod \"655a4c8a-248d-4630-889a-0932c2c2f2b9\" (UID: \"655a4c8a-248d-4630-889a-0932c2c2f2b9\") " Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.456185 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities" (OuterVolumeSpecName: "utilities") pod "655a4c8a-248d-4630-889a-0932c2c2f2b9" (UID: "655a4c8a-248d-4630-889a-0932c2c2f2b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.457857 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9" (OuterVolumeSpecName: "kube-api-access-dt6r9") pod "655a4c8a-248d-4630-889a-0932c2c2f2b9" (UID: "655a4c8a-248d-4630-889a-0932c2c2f2b9"). InnerVolumeSpecName "kube-api-access-dt6r9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.556039 4983 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.556098 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt6r9\" (UniqueName: \"kubernetes.io/projected/655a4c8a-248d-4630-889a-0932c2c2f2b9-kube-api-access-dt6r9\") on node \"crc\" DevicePath \"\"" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.599595 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "655a4c8a-248d-4630-889a-0932c2c2f2b9" (UID: "655a4c8a-248d-4630-889a-0932c2c2f2b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.601780 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"] Mar 16 00:31:05 crc kubenswrapper[4983]: W0316 00:31:05.606518 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e69fb8_91f6_4bfc_b8a5_2f9e77922ac3.slice/crio-914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8 WatchSource:0}: Error finding container 914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8: Status 404 returned error can't find the container with id 914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8 Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.657476 4983 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/655a4c8a-248d-4630-889a-0932c2c2f2b9-catalog-content\") 
on node \"crc\" DevicePath \"\"" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957303 4983 generic.go:334] "Generic (PLEG): container finished" podID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" exitCode=0 Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957379 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a"} Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957410 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ztth" event={"ID":"655a4c8a-248d-4630-889a-0932c2c2f2b9","Type":"ContainerDied","Data":"c2f45a75c848cc0c542208a6f9b28b0dde0446c369790008bc4a1b6a2fd51cfb"} Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957430 4983 scope.go:117] "RemoveContainer" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.957585 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ztth" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.965138 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerStarted","Data":"914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8"} Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.981320 4983 scope.go:117] "RemoveContainer" containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" Mar 16 00:31:05 crc kubenswrapper[4983]: I0316 00:31:05.996919 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.003312 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6ztth"] Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.006792 4983 scope.go:117] "RemoveContainer" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.024671 4983 scope.go:117] "RemoveContainer" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" Mar 16 00:31:06 crc kubenswrapper[4983]: E0316 00:31:06.025155 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a\": container with ID starting with 5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a not found: ID does not exist" containerID="5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025195 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a"} err="failed to get 
container status \"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a\": rpc error: code = NotFound desc = could not find container \"5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a\": container with ID starting with 5bbf4a2332ff8b5f6e83c2e5ba222ba17d636c92ad509af2a86a4ff5dcc6f81a not found: ID does not exist" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025225 4983 scope.go:117] "RemoveContainer" containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" Mar 16 00:31:06 crc kubenswrapper[4983]: E0316 00:31:06.025547 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df\": container with ID starting with b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df not found: ID does not exist" containerID="b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025580 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df"} err="failed to get container status \"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df\": rpc error: code = NotFound desc = could not find container \"b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df\": container with ID starting with b7d1e1b670ad74f5479e5f87a0107719ccc3f1edf5fa8d10e0a2f1b6624ef1df not found: ID does not exist" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025600 4983 scope.go:117] "RemoveContainer" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" Mar 16 00:31:06 crc kubenswrapper[4983]: E0316 00:31:06.025885 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff\": container with ID starting with ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff not found: ID does not exist" containerID="ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.025917 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff"} err="failed to get container status \"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff\": rpc error: code = NotFound desc = could not find container \"ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff\": container with ID starting with ef421b36e7486cfb12f38992dd392aca8b20360533275387e2f003ee780544ff not found: ID does not exist" Mar 16 00:31:06 crc kubenswrapper[4983]: I0316 00:31:06.105233 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" path="/var/lib/kubelet/pods/655a4c8a-248d-4630-889a-0932c2c2f2b9/volumes" Mar 16 00:31:12 crc kubenswrapper[4983]: I0316 00:31:12.003799 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerStarted","Data":"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"} Mar 16 00:31:12 crc kubenswrapper[4983]: I0316 00:31:12.022244 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" podStartSLOduration=1.608243621 podStartE2EDuration="7.022224227s" podCreationTimestamp="2026-03-16 00:31:05 +0000 UTC" firstStartedPulling="2026-03-16 00:31:05.610577239 +0000 UTC m=+1474.210675669" lastFinishedPulling="2026-03-16 00:31:11.024557815 +0000 UTC m=+1479.624656275" observedRunningTime="2026-03-16 00:31:12.019652698 +0000 UTC 
m=+1480.619751128" watchObservedRunningTime="2026-03-16 00:31:12.022224227 +0000 UTC m=+1480.622322657" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.319818 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.320470 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320489 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.320526 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-utilities" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320539 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-utilities" Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.320564 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-content" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320576 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="extract-content" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.320784 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="655a4c8a-248d-4630-889a-0932c2c2f2b9" containerName="registry-server" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.322427 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325138 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325337 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325581 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325771 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.325935 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.326064 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.326211 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.326373 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.328383 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.337267 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.338500 4983 reflector.go:368] Caches populated for *v1.Secret from 
object-"service-telemetry"/"prometheus-stf-dockercfg-hpv2q" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501527 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-tls-assets\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501579 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501624 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501791 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501855 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501911 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.501965 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502046 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c952d0a-6462-4081-8603-935847aefe14-config-out\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502084 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502125 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-web-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502200 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.502227 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmjlh\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-kube-api-access-dmjlh\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603374 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603437 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc 
kubenswrapper[4983]: I0316 00:31:15.603473 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603509 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603540 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603578 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c952d0a-6462-4081-8603-935847aefe14-config-out\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603612 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc 
kubenswrapper[4983]: I0316 00:31:15.603648 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-web-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.603667 4983 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 16 00:31:15 crc kubenswrapper[4983]: E0316 00:31:15.603807 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls podName:4c952d0a-6462-4081-8603-935847aefe14 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:16.103735275 +0000 UTC m=+1484.703833715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "4c952d0a-6462-4081-8603-935847aefe14") : secret "default-prometheus-proxy-tls" not found Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.603682 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604144 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmjlh\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-kube-api-access-dmjlh\") pod \"prometheus-default-0\" (UID: 
\"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604280 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-tls-assets\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604326 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604394 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604610 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.604948 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.605318 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/4c952d0a-6462-4081-8603-935847aefe14-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610160 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-web-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610254 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-config\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610327 4983 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610508 4983 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d26c15a88fbfc712781200258d29c496371f2e856d0ee515a41c781249c487a5/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.610529 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-tls-assets\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.611676 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.618960 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4c952d0a-6462-4081-8603-935847aefe14-config-out\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.628685 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmjlh\" (UniqueName: \"kubernetes.io/projected/4c952d0a-6462-4081-8603-935847aefe14-kube-api-access-dmjlh\") pod 
\"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:15 crc kubenswrapper[4983]: I0316 00:31:15.656966 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-45c029b0-a3f9-42ac-97ec-90bbd18c6387\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:16 crc kubenswrapper[4983]: I0316 00:31:16.112192 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:16 crc kubenswrapper[4983]: E0316 00:31:16.112393 4983 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 16 00:31:16 crc kubenswrapper[4983]: E0316 00:31:16.112465 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls podName:4c952d0a-6462-4081-8603-935847aefe14 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:17.112446009 +0000 UTC m=+1485.712544449 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "4c952d0a-6462-4081-8603-935847aefe14") : secret "default-prometheus-proxy-tls" not found Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.125510 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.132603 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c952d0a-6462-4081-8603-935847aefe14-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"4c952d0a-6462-4081-8603-935847aefe14\") " pod="service-telemetry/prometheus-default-0" Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.156102 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 16 00:31:17 crc kubenswrapper[4983]: I0316 00:31:17.559190 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 16 00:31:17 crc kubenswrapper[4983]: W0316 00:31:17.581911 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c952d0a_6462_4081_8603_935847aefe14.slice/crio-f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f WatchSource:0}: Error finding container f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f: Status 404 returned error can't find the container with id f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f Mar 16 00:31:18 crc kubenswrapper[4983]: I0316 00:31:18.049741 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"f2154d67f0d902a913fa5c0f9f59f38ed9c42f8d7c69d3bdb5686fca4f3dc16f"} Mar 16 00:31:22 crc kubenswrapper[4983]: I0316 00:31:22.082466 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"a4be303f3d3093e70f5251c499293adcd190f43b473540357b97c4147dad99f7"} Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.083559 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4h8qf"] Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.085369 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.095975 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4h8qf"] Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.239530 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z8b4\" (UniqueName: \"kubernetes.io/projected/1f2a95c7-282e-4200-ac63-1a114726205b-kube-api-access-5z8b4\") pod \"default-snmp-webhook-6856cfb745-4h8qf\" (UID: \"1f2a95c7-282e-4200-ac63-1a114726205b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.341115 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z8b4\" (UniqueName: \"kubernetes.io/projected/1f2a95c7-282e-4200-ac63-1a114726205b-kube-api-access-5z8b4\") pod \"default-snmp-webhook-6856cfb745-4h8qf\" (UID: \"1f2a95c7-282e-4200-ac63-1a114726205b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.374038 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z8b4\" (UniqueName: \"kubernetes.io/projected/1f2a95c7-282e-4200-ac63-1a114726205b-kube-api-access-5z8b4\") pod \"default-snmp-webhook-6856cfb745-4h8qf\" (UID: \"1f2a95c7-282e-4200-ac63-1a114726205b\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.403410 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" Mar 16 00:31:25 crc kubenswrapper[4983]: I0316 00:31:25.635858 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4h8qf"] Mar 16 00:31:25 crc kubenswrapper[4983]: W0316 00:31:25.656199 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f2a95c7_282e_4200_ac63_1a114726205b.slice/crio-36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9 WatchSource:0}: Error finding container 36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9: Status 404 returned error can't find the container with id 36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9 Mar 16 00:31:26 crc kubenswrapper[4983]: I0316 00:31:26.127564 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" event={"ID":"1f2a95c7-282e-4200-ac63-1a114726205b","Type":"ContainerStarted","Data":"36f9dc083475ccd68eae41ae2165e36d12e11efa792baea2630c7e3d0614a8c9"} Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.956506 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.959172 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964110 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964300 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-lptq9" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964198 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964300 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.964497 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.965013 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Mar 16 00:31:28 crc kubenswrapper[4983]: I0316 00:31:28.969549 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099563 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099608 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jpw8\" (UniqueName: 
\"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-kube-api-access-4jpw8\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099653 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-config-volume\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099741 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099778 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099806 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0" Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099825 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099860 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-web-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.099880 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f21bd8c3-505c-465a-afeb-404a9136ea58-config-out\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.149468 4983 generic.go:334] "Generic (PLEG): container finished" podID="4c952d0a-6462-4081-8603-935847aefe14" containerID="a4be303f3d3093e70f5251c499293adcd190f43b473540357b97c4147dad99f7" exitCode=0
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.149522 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerDied","Data":"a4be303f3d3093e70f5251c499293adcd190f43b473540357b97c4147dad99f7"}
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201214 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f21bd8c3-505c-465a-afeb-404a9136ea58-config-out\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201289 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201317 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jpw8\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-kube-api-access-4jpw8\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201362 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-config-volume\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201384 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201406 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201425 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201447 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.201566 4983 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.201618 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls podName:f21bd8c3-505c-465a-afeb-404a9136ea58 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:29.701601708 +0000 UTC m=+1498.301700138 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f21bd8c3-505c-465a-afeb-404a9136ea58") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.201797 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-web-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.206968 4983 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.207164 4983 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be416d59a4e641c720596f27e26bd8c509ca92bf347a2cb40479f09ccc772acf/globalmount\"" pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.207426 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-tls-assets\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.207439 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.208379 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-web-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.208397 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f21bd8c3-505c-465a-afeb-404a9136ea58-config-out\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.209028 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-config-volume\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.212553 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.227735 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jpw8\" (UniqueName: \"kubernetes.io/projected/f21bd8c3-505c-465a-afeb-404a9136ea58-kube-api-access-4jpw8\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.239095 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a35c1573-8441-4092-8560-86f7726028dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a35c1573-8441-4092-8560-86f7726028dc\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: I0316 00:31:29.709002 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.709209 4983 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:31:29 crc kubenswrapper[4983]: E0316 00:31:29.709310 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls podName:f21bd8c3-505c-465a-afeb-404a9136ea58 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:30.709291995 +0000 UTC m=+1499.309390425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f21bd8c3-505c-465a-afeb-404a9136ea58") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:31:30 crc kubenswrapper[4983]: I0316 00:31:30.722308 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:30 crc kubenswrapper[4983]: E0316 00:31:30.722541 4983 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 16 00:31:30 crc kubenswrapper[4983]: E0316 00:31:30.722740 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls podName:f21bd8c3-505c-465a-afeb-404a9136ea58 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:32.722719977 +0000 UTC m=+1501.322818407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "f21bd8c3-505c-465a-afeb-404a9136ea58") : secret "default-alertmanager-proxy-tls" not found
Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.752328 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.778494 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f21bd8c3-505c-465a-afeb-404a9136ea58-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"f21bd8c3-505c-465a-afeb-404a9136ea58\") " pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.893896 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-lptq9"
Mar 16 00:31:32 crc kubenswrapper[4983]: I0316 00:31:32.902589 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 16 00:31:33 crc kubenswrapper[4983]: I0316 00:31:33.174054 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" event={"ID":"1f2a95c7-282e-4200-ac63-1a114726205b","Type":"ContainerStarted","Data":"be75eb3523134a08101e6f9f2c283897f22428d514bb003660b28d3d65d781db"}
Mar 16 00:31:33 crc kubenswrapper[4983]: I0316 00:31:33.188745 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-4h8qf" podStartSLOduration=1.270071024 podStartE2EDuration="8.18873036s" podCreationTimestamp="2026-03-16 00:31:25 +0000 UTC" firstStartedPulling="2026-03-16 00:31:25.657989791 +0000 UTC m=+1494.258088241" lastFinishedPulling="2026-03-16 00:31:32.576649147 +0000 UTC m=+1501.176747577" observedRunningTime="2026-03-16 00:31:33.187641721 +0000 UTC m=+1501.787740151" watchObservedRunningTime="2026-03-16 00:31:33.18873036 +0000 UTC m=+1501.788828790"
Mar 16 00:31:33 crc kubenswrapper[4983]: I0316 00:31:33.372147 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 16 00:31:34 crc kubenswrapper[4983]: I0316 00:31:34.181998 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"5533756ed7ab22efdb85b446e09f294bae54d82deec5fe73e3c9c19b6ba14572"}
Mar 16 00:31:35 crc kubenswrapper[4983]: I0316 00:31:35.195224 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"2c52f71e3a3618defbe3cada2f72b680a7ee28c3d43c22c86d95d1f72f9090c0"}
Mar 16 00:31:37 crc kubenswrapper[4983]: I0316 00:31:37.210515 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"de037c5a808a8b93cfe8e1480355cbc928d4ee7e260f061ff9177e929bfdbeeb"}
Mar 16 00:31:40 crc kubenswrapper[4983]: I0316 00:31:40.233647 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"12824209a1723cb781c2e7089d7056063a2a17a195feeea948f170be7f02a146"}
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.240897 4983 generic.go:334] "Generic (PLEG): container finished" podID="f21bd8c3-505c-465a-afeb-404a9136ea58" containerID="2c52f71e3a3618defbe3cada2f72b680a7ee28c3d43c22c86d95d1f72f9090c0" exitCode=0
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.240940 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerDied","Data":"2c52f71e3a3618defbe3cada2f72b680a7ee28c3d43c22c86d95d1f72f9090c0"}
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.491843 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"]
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.493547 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.497743 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.498552 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-qksbs"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.498605 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.498658 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.508658 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"]
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578216 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578298 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578319 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b5da06a-5282-43dc-b876-76eb99ba6f9d-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578351 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmgt\" (UniqueName: \"kubernetes.io/projected/1b5da06a-5282-43dc-b876-76eb99ba6f9d-kube-api-access-cjmgt\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.578370 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b5da06a-5282-43dc-b876-76eb99ba6f9d-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.679655 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680783 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680832 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b5da06a-5282-43dc-b876-76eb99ba6f9d-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680882 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmgt\" (UniqueName: \"kubernetes.io/projected/1b5da06a-5282-43dc-b876-76eb99ba6f9d-kube-api-access-cjmgt\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.680909 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b5da06a-5282-43dc-b876-76eb99ba6f9d-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.681339 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b5da06a-5282-43dc-b876-76eb99ba6f9d-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: E0316 00:31:41.681405 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 16 00:31:41 crc kubenswrapper[4983]: E0316 00:31:41.681463 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls podName:1b5da06a-5282-43dc-b876-76eb99ba6f9d nodeName:}" failed. No retries permitted until 2026-03-16 00:31:42.181448318 +0000 UTC m=+1510.781546748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" (UID: "1b5da06a-5282-43dc-b876-76eb99ba6f9d") : secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.681908 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b5da06a-5282-43dc-b876-76eb99ba6f9d-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.696435 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:41 crc kubenswrapper[4983]: I0316 00:31:41.707550 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmgt\" (UniqueName: \"kubernetes.io/projected/1b5da06a-5282-43dc-b876-76eb99ba6f9d-kube-api-access-cjmgt\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:42 crc kubenswrapper[4983]: I0316 00:31:42.191495 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:42 crc kubenswrapper[4983]: E0316 00:31:42.191787 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 16 00:31:42 crc kubenswrapper[4983]: E0316 00:31:42.192387 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls podName:1b5da06a-5282-43dc-b876-76eb99ba6f9d nodeName:}" failed. No retries permitted until 2026-03-16 00:31:43.192360091 +0000 UTC m=+1511.792458521 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" (UID: "1b5da06a-5282-43dc-b876-76eb99ba6f9d") : secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.212845 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.223199 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1b5da06a-5282-43dc-b876-76eb99ba6f9d-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82\" (UID: \"1b5da06a-5282-43dc-b876-76eb99ba6f9d\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.319058 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"
Mar 16 00:31:43 crc kubenswrapper[4983]: I0316 00:31:43.778243 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82"]
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.126953 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"]
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.129725 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.133865 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.133886 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.138660 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"]
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227476 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227541 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzp5z\" (UniqueName: \"kubernetes.io/projected/c6f00393-4848-47e5-8836-3e3b9c3a5b95-kube-api-access-mzp5z\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227656 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227683 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6f00393-4848-47e5-8836-3e3b9c3a5b95-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.227712 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c6f00393-4848-47e5-8836-3e3b9c3a5b95-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.265126 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"df64c22da3a2e39417b17fad62d7e7fb95f18d4d0420be571ad099b7cfe6a4f1"}
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328381 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6f00393-4848-47e5-8836-3e3b9c3a5b95-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328426 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c6f00393-4848-47e5-8836-3e3b9c3a5b95-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328480 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328508 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzp5z\" (UniqueName: \"kubernetes.io/projected/c6f00393-4848-47e5-8836-3e3b9c3a5b95-kube-api-access-mzp5z\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.328559 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.329381 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found
Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.329456 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls podName:c6f00393-4848-47e5-8836-3e3b9c3a5b95 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:44.829434307 +0000 UTC m=+1513.429532737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" (UID: "c6f00393-4848-47e5-8836-3e3b9c3a5b95") : secret "default-cloud1-ceil-meter-proxy-tls" not found
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.329664 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c6f00393-4848-47e5-8836-3e3b9c3a5b95-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.330312 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c6f00393-4848-47e5-8836-3e3b9c3a5b95-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.333886 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.347586 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzp5z\" (UniqueName: \"kubernetes.io/projected/c6f00393-4848-47e5-8836-3e3b9c3a5b95-kube-api-access-mzp5z\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: I0316 00:31:44.836361 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.836552 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found
Mar 16 00:31:44 crc kubenswrapper[4983]: E0316 00:31:44.836965 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls podName:c6f00393-4848-47e5-8836-3e3b9c3a5b95 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:45.836941848 +0000 UTC m=+1514.437040288 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" (UID: "c6f00393-4848-47e5-8836-3e3b9c3a5b95") : secret "default-cloud1-ceil-meter-proxy-tls" not found
Mar 16 00:31:45 crc kubenswrapper[4983]: I0316 00:31:45.852582 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:45 crc kubenswrapper[4983]: I0316 00:31:45.858623 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/c6f00393-4848-47e5-8836-3e3b9c3a5b95-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf\" (UID: \"c6f00393-4848-47e5-8836-3e3b9c3a5b95\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:45 crc kubenswrapper[4983]: I0316 00:31:45.956634 4983 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"
Mar 16 00:31:46 crc kubenswrapper[4983]: I0316 00:31:46.490130 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf"]
Mar 16 00:31:49 crc kubenswrapper[4983]: I0316 00:31:49.309684 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"8bd030eca22010c76446e25eb2fc48acfa8c5fac806edeae1a7316946c3d9e94"}
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.383895 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"]
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.385867 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.391590 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.391793 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.393520 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"]
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.522916 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f90f8a5e-67de-4058-9e42-0caf957b6b71-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523021 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldcq\" (UniqueName: \"kubernetes.io/projected/f90f8a5e-67de-4058-9e42-0caf957b6b71-kube-api-access-gldcq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523059 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f90f8a5e-67de-4058-9e42-0caf957b6b71-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523152 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.523212 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624098 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gldcq\" (UniqueName: \"kubernetes.io/projected/f90f8a5e-67de-4058-9e42-0caf957b6b71-kube-api-access-gldcq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624182 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f90f8a5e-67de-4058-9e42-0caf957b6b71-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624256 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624304 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624343 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f90f8a5e-67de-4058-9e42-0caf957b6b71-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: E0316 00:31:50.624547 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:50 crc kubenswrapper[4983]: E0316 00:31:50.624627 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls podName:f90f8a5e-67de-4058-9e42-0caf957b6b71 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:51.124611325 +0000 UTC m=+1519.724709845 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" (UID: "f90f8a5e-67de-4058-9e42-0caf957b6b71") : secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.624864 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f90f8a5e-67de-4058-9e42-0caf957b6b71-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.626992 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f90f8a5e-67de-4058-9e42-0caf957b6b71-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.630419 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:50 crc kubenswrapper[4983]: I0316 00:31:50.641266 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldcq\" (UniqueName: \"kubernetes.io/projected/f90f8a5e-67de-4058-9e42-0caf957b6b71-kube-api-access-gldcq\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.131572 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:51 crc kubenswrapper[4983]: E0316 00:31:51.131976 4983 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:51 crc kubenswrapper[4983]: E0316 00:31:51.132071 4983 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls podName:f90f8a5e-67de-4058-9e42-0caf957b6b71 nodeName:}" failed. No retries permitted until 2026-03-16 00:31:52.132046546 +0000 UTC m=+1520.732144976 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" (UID: "f90f8a5e-67de-4058-9e42-0caf957b6b71") : secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.334995 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"4c952d0a-6462-4081-8603-935847aefe14","Type":"ContainerStarted","Data":"588319b4fba0eb4322bbf9e60c60f0d283ea764278fa05a6a525acbc14c99fbd"}
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.342375 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"72f9951154030ce2edcf04af07553d25087ad4a56d8d1c37f77bd72ddbca233b"}
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.345047 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"8015ab262caa609b03d18e295501cedb02b1d7eb16855c09aae9b5749ed3baed"}
Mar 16 00:31:51 crc kubenswrapper[4983]: I0316 00:31:51.370126 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=4.010286671 podStartE2EDuration="37.370112309s" podCreationTimestamp="2026-03-16 00:31:14 +0000 UTC" firstStartedPulling="2026-03-16 00:31:17.583649896 +0000 UTC m=+1486.183748326" lastFinishedPulling="2026-03-16 00:31:50.943475544 +0000 UTC m=+1519.543573964" observedRunningTime="2026-03-16 00:31:51.369238195 +0000 UTC m=+1519.969336625" watchObservedRunningTime="2026-03-16 00:31:51.370112309 +0000 UTC m=+1519.970210739"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.148354 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.155235 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f90f8a5e-67de-4058-9e42-0caf957b6b71-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9\" (UID: \"f90f8a5e-67de-4058-9e42-0caf957b6b71\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.156704 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0"
Mar 16 00:31:52 crc kubenswrapper[4983]: I0316 00:31:52.211888 4983 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"
Mar 16 00:31:53 crc kubenswrapper[4983]: I0316 00:31:53.350607 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9"]
Mar 16 00:31:53 crc kubenswrapper[4983]: I0316 00:31:53.448492 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 16 00:31:53 crc kubenswrapper[4983]: I0316 00:31:53.449120 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.391471 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"c517c932658f217915b64832431b2f1667ae56507da615ca5f0b78adca4bb3e9"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.413191 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.413238 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"93baf073698f37ccdc048dccbc446c7dc8f440d38e9cecc4747f7232434b84ea"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.413248 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"f0eb1123813d7cd48be0a4e4e53b889743af2cc9ad2a2317f163bc2779f92851"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.418809 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63"}
Mar 16 00:31:54 crc kubenswrapper[4983]: I0316 00:31:54.422183 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6"}
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.079844 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"]
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.081020 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.084257 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.084487 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.092849 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"]
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.203682 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.203975 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.204147 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.204275 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98h7r\" (UniqueName: \"kubernetes.io/projected/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-kube-api-access-98h7r\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.306161 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307060 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307094 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98h7r\" (UniqueName: \"kubernetes.io/projected/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-kube-api-access-98h7r\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307178 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.307538 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.308122 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.313317 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.323508 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98h7r\" (UniqueName: \"kubernetes.io/projected/c395c954-b7f5-4ec0-be3d-29c8cec19fb1-kube-api-access-98h7r\") pod \"default-cloud1-coll-event-smartgateway-7cd756799d-8ld79\" (UID: \"c395c954-b7f5-4ec0-be3d-29c8cec19fb1\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:55 crc kubenswrapper[4983]: I0316 00:31:55.407091 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.154543 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"]
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.155937 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.159069 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.163485 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"]
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227373 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lwnw\" (UniqueName: \"kubernetes.io/projected/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-kube-api-access-8lwnw\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227552 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-sg-core-config\") pod
\"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227658 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.227787 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329162 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329218 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329256 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.329311 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lwnw\" (UniqueName: \"kubernetes.io/projected/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-kube-api-access-8lwnw\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.330046 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.330377 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.338647 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.345297 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lwnw\" (UniqueName: \"kubernetes.io/projected/e4d42cf2-f1fd-4aa7-b950-0d60911c50af-kube-api-access-8lwnw\") pod \"default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v\" (UID: \"e4d42cf2-f1fd-4aa7-b950-0d60911c50af\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.450067 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"9aed0ed8df66e785bc9d5d66f4079e1e38b7174bef7af9b734679dda61f08c58"}
Mar 16 00:31:56 crc kubenswrapper[4983]: I0316 00:31:56.484134 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.423502 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v"]
Mar 16 00:31:59 crc kubenswrapper[4983]: W0316 00:31:59.432120 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4d42cf2_f1fd_4aa7_b950_0d60911c50af.slice/crio-d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233 WatchSource:0}: Error finding container d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233: Status 404 returned error can't find the container with id d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.506179 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"f7840ecdf4fcdf60af5f4eb8211546c563ed5fb0e52ea433c19b1d3a98a46270"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.509179 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"0aa8a0dec936622d4540fb518c56ffa469d81190d27e045f40b84e9b81140e2b"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.510481 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"d0e051ca78b302732e8e8f63a4d7f022c9f8e966621cb79720eb2f037144d233"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.511659 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"ca9a8afe432f12da01937c3d80e779bbcdd39612200bd7c66290ae71a8c956f2"}
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.536468 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" podStartSLOduration=5.069774346 podStartE2EDuration="15.536442357s" podCreationTimestamp="2026-03-16 00:31:44 +0000 UTC" firstStartedPulling="2026-03-16 00:31:48.675991858 +0000 UTC m=+1517.276090288" lastFinishedPulling="2026-03-16 00:31:59.142659869 +0000 UTC m=+1527.742758299" observedRunningTime="2026-03-16 00:31:59.528201247 +0000 UTC m=+1528.128299687" watchObservedRunningTime="2026-03-16 00:31:59.536442357 +0000 UTC m=+1528.136540787"
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.579521 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" podStartSLOduration=3.7774968859999998 podStartE2EDuration="9.579500036s" podCreationTimestamp="2026-03-16 00:31:50 +0000 UTC" firstStartedPulling="2026-03-16 00:31:53.364025053 +0000 UTC m=+1521.964123483" lastFinishedPulling="2026-03-16 00:31:59.166028203 +0000 UTC m=+1527.766126633" observedRunningTime="2026-03-16 00:31:59.558076274 +0000 UTC m=+1528.158174704" watchObservedRunningTime="2026-03-16 00:31:59.579500036 +0000 UTC m=+1528.179598466"
Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.584602 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" podStartSLOduration=3.2272827 podStartE2EDuration="18.584576831s" podCreationTimestamp="2026-03-16 00:31:41 +0000 UTC" firstStartedPulling="2026-03-16 00:31:43.781936207 +0000 UTC m=+1512.382034647"
lastFinishedPulling="2026-03-16 00:31:59.139230348 +0000 UTC m=+1527.739328778" observedRunningTime="2026-03-16 00:31:59.582052364 +0000 UTC m=+1528.182150794" watchObservedRunningTime="2026-03-16 00:31:59.584576831 +0000 UTC m=+1528.184675281" Mar 16 00:31:59 crc kubenswrapper[4983]: I0316 00:31:59.716165 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79"] Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.158086 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"] Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.159271 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.161188 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.161353 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.161470 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.167306 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"] Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.198013 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"auto-csr-approver-29560352-h2jj5\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") " pod="openshift-infra/auto-csr-approver-29560352-h2jj5" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 
00:32:00.300020 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"auto-csr-approver-29560352-h2jj5\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") " pod="openshift-infra/auto-csr-approver-29560352-h2jj5" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.331560 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"auto-csr-approver-29560352-h2jj5\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") " pod="openshift-infra/auto-csr-approver-29560352-h2jj5" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.474862 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.523496 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"8820119474c4b4aaeaf00a16224f6b12b6a7323d3aa4cea1c91c92b863c312e2"} Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.523539 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4"} Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.523548 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" 
event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"7d26c3e5619765fa1e99d6abcec28a53dc87906630a769d555889ac29a33178f"} Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.529573 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"f21bd8c3-505c-465a-afeb-404a9136ea58","Type":"ContainerStarted","Data":"b391a427f5aec737b75d773134851b88c11bb653f27ae28c101fa30d4149a75b"} Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.534119 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"7d1f02eba871e4e5fe005c8076ae7879a4fafc6fb5e7fb9bbde386f0b69b76ac"} Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.534162 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a"} Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.559451 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" podStartSLOduration=5.108382089 podStartE2EDuration="5.559405434s" podCreationTimestamp="2026-03-16 00:31:55 +0000 UTC" firstStartedPulling="2026-03-16 00:31:59.783122909 +0000 UTC m=+1528.383221339" lastFinishedPulling="2026-03-16 00:32:00.234146254 +0000 UTC m=+1528.834244684" observedRunningTime="2026-03-16 00:32:00.555048677 +0000 UTC m=+1529.155147117" watchObservedRunningTime="2026-03-16 00:32:00.559405434 +0000 UTC m=+1529.159503864" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.581632 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" podStartSLOduration=4.109260681 podStartE2EDuration="4.581612366s" podCreationTimestamp="2026-03-16 00:31:56 +0000 UTC" firstStartedPulling="2026-03-16 00:31:59.443492556 +0000 UTC m=+1528.043590986" lastFinishedPulling="2026-03-16 00:31:59.915844241 +0000 UTC m=+1528.515942671" observedRunningTime="2026-03-16 00:32:00.57501768 +0000 UTC m=+1529.175116130" watchObservedRunningTime="2026-03-16 00:32:00.581612366 +0000 UTC m=+1529.181710796" Mar 16 00:32:00 crc kubenswrapper[4983]: I0316 00:32:00.649411 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=15.303882348 podStartE2EDuration="33.649390115s" podCreationTimestamp="2026-03-16 00:31:27 +0000 UTC" firstStartedPulling="2026-03-16 00:31:41.243651205 +0000 UTC m=+1509.843749636" lastFinishedPulling="2026-03-16 00:31:59.589158973 +0000 UTC m=+1528.189257403" observedRunningTime="2026-03-16 00:32:00.61997671 +0000 UTC m=+1529.220075140" watchObservedRunningTime="2026-03-16 00:32:00.649390115 +0000 UTC m=+1529.249488545" Mar 16 00:32:01 crc kubenswrapper[4983]: I0316 00:32:01.091743 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"] Mar 16 00:32:01 crc kubenswrapper[4983]: I0316 00:32:01.542612 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerStarted","Data":"1a24252820d3b1761c9e6b612d0d04c82239e2b635f6875b1ac06e6defe006a0"} Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.156813 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.192893 4983 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="service-telemetry/prometheus-default-0" Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.574145 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerStarted","Data":"e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd"} Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.585672 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" podStartSLOduration=1.479191047 podStartE2EDuration="2.585658672s" podCreationTimestamp="2026-03-16 00:32:00 +0000 UTC" firstStartedPulling="2026-03-16 00:32:01.082249725 +0000 UTC m=+1529.682348155" lastFinishedPulling="2026-03-16 00:32:02.18871735 +0000 UTC m=+1530.788815780" observedRunningTime="2026-03-16 00:32:02.583921666 +0000 UTC m=+1531.184020106" watchObservedRunningTime="2026-03-16 00:32:02.585658672 +0000 UTC m=+1531.185757102" Mar 16 00:32:02 crc kubenswrapper[4983]: I0316 00:32:02.614192 4983 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Mar 16 00:32:03 crc kubenswrapper[4983]: I0316 00:32:03.580478 4983 generic.go:334] "Generic (PLEG): container finished" podID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerID="e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd" exitCode=0 Mar 16 00:32:03 crc kubenswrapper[4983]: I0316 00:32:03.580545 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerDied","Data":"e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd"} Mar 16 00:32:04 crc kubenswrapper[4983]: I0316 00:32:04.884434 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" Mar 16 00:32:04 crc kubenswrapper[4983]: I0316 00:32:04.974296 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") pod \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\" (UID: \"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf\") " Mar 16 00:32:04 crc kubenswrapper[4983]: I0316 00:32:04.981630 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh" (OuterVolumeSpecName: "kube-api-access-c9xfh") pod "69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" (UID: "69be3986-5dfe-49cd-a9c9-8bde7c59eaaf"). InnerVolumeSpecName "kube-api-access-c9xfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.076265 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9xfh\" (UniqueName: \"kubernetes.io/projected/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf-kube-api-access-c9xfh\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.178006 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.186788 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560346-xpbzq"] Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.594663 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" event={"ID":"69be3986-5dfe-49cd-a9c9-8bde7c59eaaf","Type":"ContainerDied","Data":"1a24252820d3b1761c9e6b612d0d04c82239e2b635f6875b1ac06e6defe006a0"} Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.594701 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560352-h2jj5" Mar 16 00:32:05 crc kubenswrapper[4983]: I0316 00:32:05.594710 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a24252820d3b1761c9e6b612d0d04c82239e2b635f6875b1ac06e6defe006a0" Mar 16 00:32:06 crc kubenswrapper[4983]: I0316 00:32:06.102262 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb0fb90-2fa7-4376-8997-678868e0832a" path="/var/lib/kubelet/pods/3cb0fb90-2fa7-4376-8997-678868e0832a/volumes" Mar 16 00:32:06 crc kubenswrapper[4983]: I0316 00:32:06.890866 4983 scope.go:117] "RemoveContainer" containerID="2ff36755f0536b7a806a46f7132557ff7e7da3301af8403a26d90d34c8f6b2e8" Mar 16 00:32:08 crc kubenswrapper[4983]: I0316 00:32:08.735190 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"] Mar 16 00:32:08 crc kubenswrapper[4983]: I0316 00:32:08.735729 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect" containerID="cri-o://9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235" gracePeriod=30 Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.086982 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142376 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142526 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142576 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142639 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142705 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 
00:32:09.142746 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.142793 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") pod \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\" (UID: \"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3\") " Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.143770 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.147578 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "sasl-users". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.151957 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.151995 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.158256 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.158282 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv" (OuterVolumeSpecName: "kube-api-access-4wcbv") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "kube-api-access-4wcbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.158381 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" (UID: "b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245272 4983 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245299 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcbv\" (UniqueName: \"kubernetes.io/projected/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-kube-api-access-4wcbv\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245311 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245320 4983 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245329 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-inter-router-credentials\") on node \"crc\" 
DevicePath \"\"" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245339 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.245348 4983 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.628389 4983 generic.go:334] "Generic (PLEG): container finished" podID="e4d42cf2-f1fd-4aa7-b950-0d60911c50af" containerID="53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a" exitCode=0 Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.628431 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerDied","Data":"53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a"} Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.629328 4983 scope.go:117] "RemoveContainer" containerID="53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630641 4983 generic.go:334] "Generic (PLEG): container finished" podID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235" exitCode=0 Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630705 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" 
event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerDied","Data":"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"} Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630737 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" event={"ID":"b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3","Type":"ContainerDied","Data":"914a76d3eee89820d29491da1b012ee2154bcaedef1aaca3a8fafb3b8f5f37b8"} Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630775 4983 scope.go:117] "RemoveContainer" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.630916 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p4tqx" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.636839 4983 generic.go:334] "Generic (PLEG): container finished" podID="f90f8a5e-67de-4058-9e42-0caf957b6b71" containerID="0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1" exitCode=0 Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.636936 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerDied","Data":"0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1"} Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.637495 4983 scope.go:117] "RemoveContainer" containerID="0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.654161 4983 generic.go:334] "Generic (PLEG): container finished" podID="c6f00393-4848-47e5-8836-3e3b9c3a5b95" containerID="de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63" exitCode=0 Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.654330 4983 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerDied","Data":"de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63"} Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.655082 4983 scope.go:117] "RemoveContainer" containerID="de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.664954 4983 scope.go:117] "RemoveContainer" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235" Mar 16 00:32:09 crc kubenswrapper[4983]: E0316 00:32:09.666122 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235\": container with ID starting with 9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235 not found: ID does not exist" containerID="9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.666185 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235"} err="failed to get container status \"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235\": rpc error: code = NotFound desc = could not find container \"9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235\": container with ID starting with 9066f64ce687ed8da544065f4a225d32eba44f77dc47f406c4e22bf1dac91235 not found: ID does not exist" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.688259 4983 generic.go:334] "Generic (PLEG): container finished" podID="1b5da06a-5282-43dc-b876-76eb99ba6f9d" containerID="d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6" exitCode=0 Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.688412 4983 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerDied","Data":"d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6"} Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.689282 4983 scope.go:117] "RemoveContainer" containerID="d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.693650 4983 generic.go:334] "Generic (PLEG): container finished" podID="c395c954-b7f5-4ec0-be3d-29c8cec19fb1" containerID="9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4" exitCode=0 Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.693691 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerDied","Data":"9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4"} Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.700264 4983 scope.go:117] "RemoveContainer" containerID="9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4" Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.752682 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"] Mar 16 00:32:09 crc kubenswrapper[4983]: I0316 00:32:09.765643 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p4tqx"] Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.101941 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" path="/var/lib/kubelet/pods/b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3/volumes" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612335 4983 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["service-telemetry/default-interconnect-68864d46cb-4bn8h"] Mar 16 00:32:10 crc kubenswrapper[4983]: E0316 00:32:10.612664 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612684 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect" Mar 16 00:32:10 crc kubenswrapper[4983]: E0316 00:32:10.612699 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerName="oc" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612707 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerName="oc" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612872 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e69fb8-91f6-4bfc-b8a5-2f9e77922ac3" containerName="default-interconnect" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.612894 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" containerName="oc" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.613408 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.617839 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618014 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618116 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618043 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618356 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.618387 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.622085 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-kdvm9" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.624027 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4bn8h"] Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.665892 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: 
\"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666171 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-sasl-users\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666274 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6pw\" (UniqueName: \"kubernetes.io/projected/765c4911-7ef9-4466-9038-70c84c21e731-kube-api-access-ft6pw\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666377 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/765c4911-7ef9-4466-9038-70c84c21e731-sasl-config\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666484 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666518 4983 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.666584 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.703009 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620"} Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.708887 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1"} Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.712989 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798"} Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.729319 
4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a"} Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.732332 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15"} Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.768899 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/765c4911-7ef9-4466-9038-70c84c21e731-sasl-config\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.768979 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769020 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc 
kubenswrapper[4983]: I0316 00:32:10.769119 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769203 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769323 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-sasl-users\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.769374 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6pw\" (UniqueName: \"kubernetes.io/projected/765c4911-7ef9-4466-9038-70c84c21e731-kube-api-access-ft6pw\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.770154 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/765c4911-7ef9-4466-9038-70c84c21e731-sasl-config\") pod 
\"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.778879 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.779461 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.788837 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6pw\" (UniqueName: \"kubernetes.io/projected/765c4911-7ef9-4466-9038-70c84c21e731-kube-api-access-ft6pw\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.821617 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 
00:32:10.822956 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-sasl-users\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.824651 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/765c4911-7ef9-4466-9038-70c84c21e731-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-4bn8h\" (UID: \"765c4911-7ef9-4466-9038-70c84c21e731\") " pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:10 crc kubenswrapper[4983]: I0316 00:32:10.937485 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.478601 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-4bn8h"] Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.743595 4983 generic.go:334] "Generic (PLEG): container finished" podID="1b5da06a-5282-43dc-b876-76eb99ba6f9d" containerID="02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.743682 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerDied","Data":"02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.743749 4983 scope.go:117] "RemoveContainer" containerID="d1aead081db4f20ffd00d7a3d89630f4d9a6f1c75b1a2c6f1033ef02a2e687b6" Mar 16 00:32:11 crc 
kubenswrapper[4983]: I0316 00:32:11.744426 4983 scope.go:117] "RemoveContainer" containerID="02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.744689 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82_service-telemetry(1b5da06a-5282-43dc-b876-76eb99ba6f9d)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" podUID="1b5da06a-5282-43dc-b876-76eb99ba6f9d" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.747032 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" event={"ID":"765c4911-7ef9-4466-9038-70c84c21e731","Type":"ContainerStarted","Data":"6b9eaa0bce094cfdcf8bcb97f816da561bc2bdad17e78f44927011bfff8426fe"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.747079 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" event={"ID":"765c4911-7ef9-4466-9038-70c84c21e731","Type":"ContainerStarted","Data":"16b6adc4d9fe3612f3abb3da989439a47527f6a10a92c02041043cc98af2121f"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.752907 4983 generic.go:334] "Generic (PLEG): container finished" podID="c395c954-b7f5-4ec0-be3d-29c8cec19fb1" containerID="9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.753004 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerDied","Data":"9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.754082 4983 scope.go:117] 
"RemoveContainer" containerID="9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.754388 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-7cd756799d-8ld79_service-telemetry(c395c954-b7f5-4ec0-be3d-29c8cec19fb1)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" podUID="c395c954-b7f5-4ec0-be3d-29c8cec19fb1" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.759343 4983 generic.go:334] "Generic (PLEG): container finished" podID="e4d42cf2-f1fd-4aa7-b950-0d60911c50af" containerID="4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.759424 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerDied","Data":"4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.760098 4983 scope.go:117] "RemoveContainer" containerID="4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.760410 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v_service-telemetry(e4d42cf2-f1fd-4aa7-b950-0d60911c50af)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" podUID="e4d42cf2-f1fd-4aa7-b950-0d60911c50af" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.773195 4983 scope.go:117] "RemoveContainer" 
containerID="9d60a4524a919c756cb4c09d07a1048e63fe633d565763fc762848b2f27837f4" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.775578 4983 generic.go:334] "Generic (PLEG): container finished" podID="f90f8a5e-67de-4058-9e42-0caf957b6b71" containerID="12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.775681 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerDied","Data":"12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.776519 4983 scope.go:117] "RemoveContainer" containerID="12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.776804 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9_service-telemetry(f90f8a5e-67de-4058-9e42-0caf957b6b71)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" podUID="f90f8a5e-67de-4058-9e42-0caf957b6b71" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.799503 4983 generic.go:334] "Generic (PLEG): container finished" podID="c6f00393-4848-47e5-8836-3e3b9c3a5b95" containerID="5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798" exitCode=0 Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.799576 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerDied","Data":"5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798"} Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 
00:32:11.802582 4983 scope.go:117] "RemoveContainer" containerID="5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798" Mar 16 00:32:11 crc kubenswrapper[4983]: E0316 00:32:11.805526 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf_service-telemetry(c6f00393-4848-47e5-8836-3e3b9c3a5b95)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" podUID="c6f00393-4848-47e5-8836-3e3b9c3a5b95" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.829025 4983 scope.go:117] "RemoveContainer" containerID="53f4487b14468a504bb34d307152a174c835e495a8ccf6a08ce9083e42f32a9a" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.847006 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-4bn8h" podStartSLOduration=3.8469824299999997 podStartE2EDuration="3.84698243s" podCreationTimestamp="2026-03-16 00:32:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-16 00:32:11.839015387 +0000 UTC m=+1540.439113827" watchObservedRunningTime="2026-03-16 00:32:11.84698243 +0000 UTC m=+1540.447080860" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.866670 4983 scope.go:117] "RemoveContainer" containerID="0cc7aaee559d458119dc2ee5a3e2cb02a2dfd2bf9ba1ad2ded84686f30c14aa1" Mar 16 00:32:11 crc kubenswrapper[4983]: I0316 00:32:11.953934 4983 scope.go:117] "RemoveContainer" containerID="de0cb00eb368a83860a045f9347c998a8a509933602b7ea228fb491b6a620d63" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.267709 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.269101 4983 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.271146 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.271842 4983 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.280495 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.314279 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rhhn\" (UniqueName: \"kubernetes.io/projected/3a6a664b-b239-4fa8-b423-19c1246c89cd-kube-api-access-6rhhn\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.314383 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/3a6a664b-b239-4fa8-b423-19c1246c89cd-qdr-test-config\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.314443 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/3a6a664b-b239-4fa8-b423-19c1246c89cd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.416505 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rhhn\" (UniqueName: 
\"kubernetes.io/projected/3a6a664b-b239-4fa8-b423-19c1246c89cd-kube-api-access-6rhhn\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.416629 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/3a6a664b-b239-4fa8-b423-19c1246c89cd-qdr-test-config\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.416670 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/3a6a664b-b239-4fa8-b423-19c1246c89cd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.417665 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/3a6a664b-b239-4fa8-b423-19c1246c89cd-qdr-test-config\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.426701 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/3a6a664b-b239-4fa8-b423-19c1246c89cd-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.433393 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rhhn\" (UniqueName: \"kubernetes.io/projected/3a6a664b-b239-4fa8-b423-19c1246c89cd-kube-api-access-6rhhn\") pod \"qdr-test\" (UID: 
\"3a6a664b-b239-4fa8-b423-19c1246c89cd\") " pod="service-telemetry/qdr-test" Mar 16 00:32:12 crc kubenswrapper[4983]: I0316 00:32:12.596244 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 16 00:32:13 crc kubenswrapper[4983]: I0316 00:32:13.080830 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 16 00:32:13 crc kubenswrapper[4983]: W0316 00:32:13.085977 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a6a664b_b239_4fa8_b423_19c1246c89cd.slice/crio-71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717 WatchSource:0}: Error finding container 71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717: Status 404 returned error can't find the container with id 71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717 Mar 16 00:32:13 crc kubenswrapper[4983]: I0316 00:32:13.849452 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"3a6a664b-b239-4fa8-b423-19c1246c89cd","Type":"ContainerStarted","Data":"71d44bd285b36d95af3b9fbcbff1be387784c9b12b8545177545ffd2cc53b717"} Mar 16 00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.448804 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.450173 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 
00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.933673 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"3a6a664b-b239-4fa8-b423-19c1246c89cd","Type":"ContainerStarted","Data":"911b626681d37633e50876be3127afad71c35e402072c125e8e7e4db145d326a"} Mar 16 00:32:23 crc kubenswrapper[4983]: I0316 00:32:23.949680 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=1.6262936190000001 podStartE2EDuration="11.949657746s" podCreationTimestamp="2026-03-16 00:32:12 +0000 UTC" firstStartedPulling="2026-03-16 00:32:13.089818464 +0000 UTC m=+1541.689916894" lastFinishedPulling="2026-03-16 00:32:23.413182591 +0000 UTC m=+1552.013281021" observedRunningTime="2026-03-16 00:32:23.946457021 +0000 UTC m=+1552.546555461" watchObservedRunningTime="2026-03-16 00:32:23.949657746 +0000 UTC m=+1552.549756196" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.093165 4983 scope.go:117] "RemoveContainer" containerID="4c940ecdbbdce3e920e03b490137756189110f424fca1610e244a105f302f620" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.281387 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-6kz94"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.282407 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.285219 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.285520 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.285619 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.286243 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.286294 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.286419 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.303640 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-6kz94"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400617 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400689 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400712 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400743 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400784 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.400809 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc 
kubenswrapper[4983]: I0316 00:32:24.400828 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502781 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502878 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502946 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.502976 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " 
pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.503023 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.503056 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.503093 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.504785 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.505032 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"stf-smoketest-smoke1-6kz94\" (UID: 
\"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.505492 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.505721 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.506280 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.506611 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-6kz94\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.531448 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"stf-smoketest-smoke1-6kz94\" 
(UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.614511 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.691658 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.692829 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.701815 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.806594 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"curl\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.907578 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"curl\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.927296 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"curl\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " pod="service-telemetry/curl" Mar 16 00:32:24 crc kubenswrapper[4983]: I0316 00:32:24.943795 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v" event={"ID":"e4d42cf2-f1fd-4aa7-b950-0d60911c50af","Type":"ContainerStarted","Data":"f9375a05af6f467146d3013de939726abd7db3ac804b12efd272b5b6653ea1ff"} Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.008551 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.098156 4983 scope.go:117] "RemoveContainer" containerID="02d6fd17b80381846864343ca49e33edc5379e281f42fd70ea7317cccec5b89a" Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.154150 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-6kz94"] Mar 16 00:32:25 crc kubenswrapper[4983]: W0316 00:32:25.164787 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd507b81c_ccea_4bf1_9f0c_55266c51bc27.slice/crio-a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922 WatchSource:0}: Error finding container a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922: Status 404 returned error can't find the container with id a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922 Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.285717 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.952404 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerStarted","Data":"a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922"} Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.954702 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82" 
event={"ID":"1b5da06a-5282-43dc-b876-76eb99ba6f9d","Type":"ContainerStarted","Data":"55708cf206dfc8b11c0836b329aadaa86340d09904619d2b4794423920841dc7"} Mar 16 00:32:25 crc kubenswrapper[4983]: I0316 00:32:25.957358 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85","Type":"ContainerStarted","Data":"abb62b80bf96acf3ca0f52f2c57f7d56c28aacf8192a83c8a83908a8b55945c5"} Mar 16 00:32:26 crc kubenswrapper[4983]: I0316 00:32:26.093416 4983 scope.go:117] "RemoveContainer" containerID="5010f633de52a0f3be196b392748208626babeeef07821c69f59d1b0cb4b0798" Mar 16 00:32:26 crc kubenswrapper[4983]: I0316 00:32:26.094093 4983 scope.go:117] "RemoveContainer" containerID="12e3dbf6647554e91a651ecc8cb405dd418f0b287a4bf41cee0f2269408df3b1" Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.092682 4983 scope.go:117] "RemoveContainer" containerID="9f800cb9f5e1de88d76583d5ff674f85b7732ea6e246d9d79d7811e17cd31b15" Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.976824 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7cd756799d-8ld79" event={"ID":"c395c954-b7f5-4ec0-be3d-29c8cec19fb1","Type":"ContainerStarted","Data":"aeabd7899ca69f7686c59db0d40cdd077c528d5939bade6b5b522978c1c199c2"} Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.977992 4983 generic.go:334] "Generic (PLEG): container finished" podID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerID="b6013509887d2e2f4faebc882cfa5c7abb1f33f5f915bce9e2025431e12a7234" exitCode=0 Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.978031 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85","Type":"ContainerDied","Data":"b6013509887d2e2f4faebc882cfa5c7abb1f33f5f915bce9e2025431e12a7234"} Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.981030 4983 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9" event={"ID":"f90f8a5e-67de-4058-9e42-0caf957b6b71","Type":"ContainerStarted","Data":"511bee0fe29e1cdb6f01989df7c4f201112407b0103fda5b1d647405bbaf9609"} Mar 16 00:32:27 crc kubenswrapper[4983]: I0316 00:32:27.983030 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf" event={"ID":"c6f00393-4848-47e5-8836-3e3b9c3a5b95","Type":"ContainerStarted","Data":"df18d3eedeadf2e925899f3ccaaac110cd7077a13a6d2d24c7a34e0f78e95d16"} Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.774285 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.917129 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") pod \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\" (UID: \"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85\") " Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.922417 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx" (OuterVolumeSpecName: "kube-api-access-5wjfx") pod "d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" (UID: "d68c4bdc-1f0d-47dc-ada9-07ec54da7e85"). InnerVolumeSpecName "kube-api-access-5wjfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:32:31 crc kubenswrapper[4983]: I0316 00:32:31.966134 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_d68c4bdc-1f0d-47dc-ada9-07ec54da7e85/curl/0.log" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.011559 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"d68c4bdc-1f0d-47dc-ada9-07ec54da7e85","Type":"ContainerDied","Data":"abb62b80bf96acf3ca0f52f2c57f7d56c28aacf8192a83c8a83908a8b55945c5"} Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.011601 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb62b80bf96acf3ca0f52f2c57f7d56c28aacf8192a83c8a83908a8b55945c5" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.011662 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.019038 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wjfx\" (UniqueName: \"kubernetes.io/projected/d68c4bdc-1f0d-47dc-ada9-07ec54da7e85-kube-api-access-5wjfx\") on node \"crc\" DevicePath \"\"" Mar 16 00:32:32 crc kubenswrapper[4983]: I0316 00:32:32.249039 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4h8qf_1f2a95c7-282e-4200-ac63-1a114726205b/prometheus-webhook-snmp/0.log" Mar 16 00:32:37 crc kubenswrapper[4983]: I0316 00:32:37.048390 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerStarted","Data":"7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f"} Mar 16 00:32:42 crc kubenswrapper[4983]: I0316 00:32:42.111850 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" 
event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerStarted","Data":"89d73a24a08dcc769de73df504ef03e9eb6f0444dcaf2637db15f3aa6ddfe562"} Mar 16 00:32:42 crc kubenswrapper[4983]: I0316 00:32:42.128734 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-6kz94" podStartSLOduration=1.425926246 podStartE2EDuration="18.128720032s" podCreationTimestamp="2026-03-16 00:32:24 +0000 UTC" firstStartedPulling="2026-03-16 00:32:25.169272219 +0000 UTC m=+1553.769370649" lastFinishedPulling="2026-03-16 00:32:41.872066015 +0000 UTC m=+1570.472164435" observedRunningTime="2026-03-16 00:32:42.128056895 +0000 UTC m=+1570.728155325" watchObservedRunningTime="2026-03-16 00:32:42.128720032 +0000 UTC m=+1570.728818462" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.448475 4983 patch_prober.go:28] interesting pod/machine-config-daemon-7sbnj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449079 4983 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449132 4983 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449785 4983 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"} pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 16 00:32:53 crc kubenswrapper[4983]: I0316 00:32:53.449854 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerName="machine-config-daemon" containerID="cri-o://10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" gracePeriod=600 Mar 16 00:32:54 crc kubenswrapper[4983]: E0316 00:32:54.152131 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.186985 4983 generic.go:334] "Generic (PLEG): container finished" podID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" exitCode=0 Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.187298 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerDied","Data":"10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"} Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.187441 4983 scope.go:117] "RemoveContainer" containerID="5ccb6fbab37bc5f699825790ea94936a8112c1dca902322ca17a08342dea1350" Mar 16 00:32:54 crc kubenswrapper[4983]: I0316 00:32:54.187713 4983 
scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:32:54 crc kubenswrapper[4983]: E0316 00:32:54.188007 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:02 crc kubenswrapper[4983]: I0316 00:33:02.387460 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4h8qf_1f2a95c7-282e-4200-ac63-1a114726205b/prometheus-webhook-snmp/0.log" Mar 16 00:33:08 crc kubenswrapper[4983]: I0316 00:33:08.093124 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:08 crc kubenswrapper[4983]: E0316 00:33:08.093979 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:10 crc kubenswrapper[4983]: I0316 00:33:10.316746 4983 generic.go:334] "Generic (PLEG): container finished" podID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerID="7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f" exitCode=0 Mar 16 00:33:10 crc kubenswrapper[4983]: I0316 00:33:10.316807 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" 
event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerDied","Data":"7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f"} Mar 16 00:33:10 crc kubenswrapper[4983]: I0316 00:33:10.317377 4983 scope.go:117] "RemoveContainer" containerID="7ba03515b7feec00fc3cc152ded030cc67b02f774e8b7318ac17702be069987f" Mar 16 00:33:14 crc kubenswrapper[4983]: I0316 00:33:14.358163 4983 generic.go:334] "Generic (PLEG): container finished" podID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerID="89d73a24a08dcc769de73df504ef03e9eb6f0444dcaf2637db15f3aa6ddfe562" exitCode=0 Mar 16 00:33:14 crc kubenswrapper[4983]: I0316 00:33:14.358241 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerDied","Data":"89d73a24a08dcc769de73df504ef03e9eb6f0444dcaf2637db15f3aa6ddfe562"} Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.616219 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.716536 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.716670 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717349 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717377 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717397 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717420 4983 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.717518 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") pod \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\" (UID: \"d507b81c-ccea-4bf1-9f0c-55266c51bc27\") " Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.722112 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b" (OuterVolumeSpecName: "kube-api-access-zbv9b") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "kube-api-access-zbv9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.733523 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.734934 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "healthcheck-log". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.735185 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.736204 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.738234 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.746335 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "d507b81c-ccea-4bf1-9f0c-55266c51bc27" (UID: "d507b81c-ccea-4bf1-9f0c-55266c51bc27"). InnerVolumeSpecName "ceilometer-entrypoint-script". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819301 4983 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819326 4983 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-healthcheck-log\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819334 4983 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819342 4983 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819351 4983 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-sensubility-config\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819359 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbv9b\" (UniqueName: \"kubernetes.io/projected/d507b81c-ccea-4bf1-9f0c-55266c51bc27-kube-api-access-zbv9b\") on node \"crc\" DevicePath \"\"" Mar 16 00:33:15 crc kubenswrapper[4983]: I0316 00:33:15.819367 4983 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/d507b81c-ccea-4bf1-9f0c-55266c51bc27-ceilometer-entrypoint-script\") on node 
\"crc\" DevicePath \"\"" Mar 16 00:33:16 crc kubenswrapper[4983]: I0316 00:33:16.375085 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-6kz94" event={"ID":"d507b81c-ccea-4bf1-9f0c-55266c51bc27","Type":"ContainerDied","Data":"a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922"} Mar 16 00:33:16 crc kubenswrapper[4983]: I0316 00:33:16.375478 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a77fd65b4f8a0f40d75d34e2263e5dff4a988a6a5fd9e03c81b3feb36727d922" Mar 16 00:33:16 crc kubenswrapper[4983]: I0316 00:33:16.375182 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-6kz94" Mar 16 00:33:17 crc kubenswrapper[4983]: I0316 00:33:17.586790 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-6kz94_d507b81c-ccea-4bf1-9f0c-55266c51bc27/smoketest-collectd/0.log" Mar 16 00:33:17 crc kubenswrapper[4983]: I0316 00:33:17.830858 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-6kz94_d507b81c-ccea-4bf1-9f0c-55266c51bc27/smoketest-ceilometer/0.log" Mar 16 00:33:18 crc kubenswrapper[4983]: I0316 00:33:18.098868 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-4bn8h_765c4911-7ef9-4466-9038-70c84c21e731/default-interconnect/0.log" Mar 16 00:33:18 crc kubenswrapper[4983]: I0316 00:33:18.350507 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82_1b5da06a-5282-43dc-b876-76eb99ba6f9d/bridge/2.log" Mar 16 00:33:18 crc kubenswrapper[4983]: I0316 00:33:18.599360 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-2tf82_1b5da06a-5282-43dc-b876-76eb99ba6f9d/sg-core/0.log" Mar 16 00:33:18 
crc kubenswrapper[4983]: I0316 00:33:18.862950 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7cd756799d-8ld79_c395c954-b7f5-4ec0-be3d-29c8cec19fb1/bridge/2.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.119554 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7cd756799d-8ld79_c395c954-b7f5-4ec0-be3d-29c8cec19fb1/sg-core/0.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.402987 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf_c6f00393-4848-47e5-8836-3e3b9c3a5b95/bridge/2.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.668629 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-lh9kf_c6f00393-4848-47e5-8836-3e3b9c3a5b95/sg-core/0.log" Mar 16 00:33:19 crc kubenswrapper[4983]: I0316 00:33:19.955196 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v_e4d42cf2-f1fd-4aa7-b950-0d60911c50af/bridge/2.log" Mar 16 00:33:20 crc kubenswrapper[4983]: I0316 00:33:20.211089 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-5b76b6fc95-fg62v_e4d42cf2-f1fd-4aa7-b950-0d60911c50af/sg-core/0.log" Mar 16 00:33:20 crc kubenswrapper[4983]: I0316 00:33:20.465146 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9_f90f8a5e-67de-4058-9e42-0caf957b6b71/bridge/2.log" Mar 16 00:33:20 crc kubenswrapper[4983]: I0316 00:33:20.695395 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-f9bg9_f90f8a5e-67de-4058-9e42-0caf957b6b71/sg-core/0.log" Mar 16 00:33:23 crc kubenswrapper[4983]: I0316 00:33:23.092387 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:23 crc kubenswrapper[4983]: E0316 00:33:23.092950 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:24 crc kubenswrapper[4983]: I0316 00:33:24.264501 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-64877956d4-ljbdp_5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d/operator/0.log" Mar 16 00:33:24 crc kubenswrapper[4983]: I0316 00:33:24.549370 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_4c952d0a-6462-4081-8603-935847aefe14/prometheus/0.log" Mar 16 00:33:24 crc kubenswrapper[4983]: I0316 00:33:24.816551 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_56cd8a8b-9f7b-45aa-ad0a-0c84fd70722e/elasticsearch/0.log" Mar 16 00:33:25 crc kubenswrapper[4983]: I0316 00:33:25.104105 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4h8qf_1f2a95c7-282e-4200-ac63-1a114726205b/prometheus-webhook-snmp/0.log" Mar 16 00:33:25 crc kubenswrapper[4983]: I0316 00:33:25.360812 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_f21bd8c3-505c-465a-afeb-404a9136ea58/alertmanager/0.log" Mar 16 00:33:38 
crc kubenswrapper[4983]: I0316 00:33:38.092645 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:38 crc kubenswrapper[4983]: E0316 00:33:38.093342 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:33:39 crc kubenswrapper[4983]: I0316 00:33:39.284006 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-65fdb44596-qnp9k_b8f4edcf-0403-4d59-b045-e618c6aabff5/operator/0.log" Mar 16 00:33:42 crc kubenswrapper[4983]: I0316 00:33:42.521900 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-64877956d4-ljbdp_5c3140b4-67ae-4012-bd8b-9cecbcb4ff4d/operator/0.log" Mar 16 00:33:42 crc kubenswrapper[4983]: I0316 00:33:42.799127 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_3a6a664b-b239-4fa8-b423-19c1246c89cd/qdr/0.log" Mar 16 00:33:52 crc kubenswrapper[4983]: I0316 00:33:52.100540 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:33:52 crc kubenswrapper[4983]: E0316 00:33:52.101420 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" 
podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159007 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560354-x6s7n"] Mar 16 00:34:00 crc kubenswrapper[4983]: E0316 00:34:00.159895 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-collectd" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159912 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-collectd" Mar 16 00:34:00 crc kubenswrapper[4983]: E0316 00:34:00.159926 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerName="curl" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159945 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerName="curl" Mar 16 00:34:00 crc kubenswrapper[4983]: E0316 00:34:00.159958 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-ceilometer" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.159967 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-ceilometer" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.162360 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" containerName="smoketest-ceilometer" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.162482 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68c4bdc-1f0d-47dc-ada9-07ec54da7e85" containerName="curl" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.162583 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="d507b81c-ccea-4bf1-9f0c-55266c51bc27" 
containerName="smoketest-collectd" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.163305 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166314 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166165 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166836 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-x6s7n"] Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.166893 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.190528 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"auto-csr-approver-29560354-x6s7n\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.291810 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"auto-csr-approver-29560354-x6s7n\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.312544 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2d7n\" (UniqueName: 
\"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"auto-csr-approver-29560354-x6s7n\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.488595 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.881327 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560354-x6s7n"] Mar 16 00:34:00 crc kubenswrapper[4983]: I0316 00:34:00.895402 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 16 00:34:01 crc kubenswrapper[4983]: I0316 00:34:01.735721 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" event={"ID":"34558a42-5623-4304-982a-7eb50d175b2d","Type":"ContainerStarted","Data":"b96276156b81bf2f529428d5a801e008a094c9f73f65bbc568358821bf0d4c60"} Mar 16 00:34:02 crc kubenswrapper[4983]: I0316 00:34:02.745312 4983 generic.go:334] "Generic (PLEG): container finished" podID="34558a42-5623-4304-982a-7eb50d175b2d" containerID="7cbb072030f4c1287101d25e5a82e3b25ae4dbd2cfe9ee92a6c85b6bbd3dd66e" exitCode=0 Mar 16 00:34:02 crc kubenswrapper[4983]: I0316 00:34:02.745501 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" event={"ID":"34558a42-5623-4304-982a-7eb50d175b2d","Type":"ContainerDied","Data":"7cbb072030f4c1287101d25e5a82e3b25ae4dbd2cfe9ee92a6c85b6bbd3dd66e"} Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.015562 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.092382 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:04 crc kubenswrapper[4983]: E0316 00:34:04.092692 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.174718 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") pod \"34558a42-5623-4304-982a-7eb50d175b2d\" (UID: \"34558a42-5623-4304-982a-7eb50d175b2d\") " Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.179780 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n" (OuterVolumeSpecName: "kube-api-access-m2d7n") pod "34558a42-5623-4304-982a-7eb50d175b2d" (UID: "34558a42-5623-4304-982a-7eb50d175b2d"). InnerVolumeSpecName "kube-api-access-m2d7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.276360 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2d7n\" (UniqueName: \"kubernetes.io/projected/34558a42-5623-4304-982a-7eb50d175b2d-kube-api-access-m2d7n\") on node \"crc\" DevicePath \"\"" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.764467 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" event={"ID":"34558a42-5623-4304-982a-7eb50d175b2d","Type":"ContainerDied","Data":"b96276156b81bf2f529428d5a801e008a094c9f73f65bbc568358821bf0d4c60"} Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.764794 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96276156b81bf2f529428d5a801e008a094c9f73f65bbc568358821bf0d4c60" Mar 16 00:34:04 crc kubenswrapper[4983]: I0316 00:34:04.764513 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560354-x6s7n" Mar 16 00:34:05 crc kubenswrapper[4983]: I0316 00:34:05.077639 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:34:05 crc kubenswrapper[4983]: I0316 00:34:05.084733 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560348-bfz7x"] Mar 16 00:34:06 crc kubenswrapper[4983]: I0316 00:34:06.104441 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee9b49cf-d10f-4047-aa75-b89a01652d64" path="/var/lib/kubelet/pods/ee9b49cf-d10f-4047-aa75-b89a01652d64/volumes" Mar 16 00:34:06 crc kubenswrapper[4983]: I0316 00:34:06.982548 4983 scope.go:117] "RemoveContainer" containerID="e7a4f48409b5b7c54ae90135f67eb66bb8944343dc341edc9f4aefa5c3b5777f" Mar 16 00:34:07 crc kubenswrapper[4983]: I0316 00:34:07.354138 4983 scope.go:117] "RemoveContainer" 
containerID="78e4a4ba6d978cbff10ef4b981fae6b3ca9ed9e86332628a7ba7f28e90fd7ea7" Mar 16 00:34:07 crc kubenswrapper[4983]: I0316 00:34:07.393443 4983 scope.go:117] "RemoveContainer" containerID="591b456456f261ac0f2b9814ab85dfd85135db5fb073dd674dfeacb880c9917a" Mar 16 00:34:16 crc kubenswrapper[4983]: I0316 00:34:16.093921 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:16 crc kubenswrapper[4983]: E0316 00:34:16.095364 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.364956 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:34:17 crc kubenswrapper[4983]: E0316 00:34:17.366269 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34558a42-5623-4304-982a-7eb50d175b2d" containerName="oc" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.366426 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="34558a42-5623-4304-982a-7eb50d175b2d" containerName="oc" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.366749 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="34558a42-5623-4304-982a-7eb50d175b2d" containerName="oc" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.372009 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.376561 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-ldkzc"/"default-dockercfg-75pn8" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.376847 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ldkzc"/"openshift-service-ca.crt" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.377130 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-ldkzc"/"kube-root-ca.crt" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.394475 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.462791 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.462871 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.564613 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " 
pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.564695 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.565118 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.593373 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"must-gather-dnctp\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:17 crc kubenswrapper[4983]: I0316 00:34:17.691634 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:34:18 crc kubenswrapper[4983]: I0316 00:34:18.136829 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:34:18 crc kubenswrapper[4983]: I0316 00:34:18.884371 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerStarted","Data":"30df828c4c51d9f6aeee032cd0b5e15fea4fb9fd1e7c5083f2b3af2a39f13cc5"} Mar 16 00:34:24 crc kubenswrapper[4983]: I0316 00:34:24.942530 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerStarted","Data":"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2"} Mar 16 00:34:24 crc kubenswrapper[4983]: I0316 00:34:24.943018 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerStarted","Data":"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b"} Mar 16 00:34:24 crc kubenswrapper[4983]: I0316 00:34:24.961558 4983 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ldkzc/must-gather-dnctp" podStartSLOduration=1.616939627 podStartE2EDuration="7.961540845s" podCreationTimestamp="2026-03-16 00:34:17 +0000 UTC" firstStartedPulling="2026-03-16 00:34:18.148472876 +0000 UTC m=+1666.748571306" lastFinishedPulling="2026-03-16 00:34:24.493074094 +0000 UTC m=+1673.093172524" observedRunningTime="2026-03-16 00:34:24.953968433 +0000 UTC m=+1673.554066863" watchObservedRunningTime="2026-03-16 00:34:24.961540845 +0000 UTC m=+1673.561639265" Mar 16 00:34:30 crc kubenswrapper[4983]: I0316 00:34:30.092204 4983 scope.go:117] "RemoveContainer" 
containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:30 crc kubenswrapper[4983]: E0316 00:34:30.092733 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:43 crc kubenswrapper[4983]: I0316 00:34:43.092916 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:43 crc kubenswrapper[4983]: E0316 00:34:43.093614 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:34:54 crc kubenswrapper[4983]: I0316 00:34:54.092309 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:34:54 crc kubenswrapper[4983]: E0316 00:34:54.093104 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.093371 4983 scope.go:117] 
"RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:07 crc kubenswrapper[4983]: E0316 00:35:07.094096 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.459089 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5pqgr_bb33b891-4cdb-4fc1-95e4-2895f40fdb7a/control-plane-machine-set-operator/0.log" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.607483 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t4lj8_6b0e4e23-a158-4597-b005-db088a652ec8/kube-rbac-proxy/0.log" Mar 16 00:35:07 crc kubenswrapper[4983]: I0316 00:35:07.639489 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-t4lj8_6b0e4e23-a158-4597-b005-db088a652ec8/machine-api-operator/0.log" Mar 16 00:35:18 crc kubenswrapper[4983]: I0316 00:35:18.992505 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-cwdn9_95208db3-d53d-43c0-9b2c-cc4c5b3236d8/cert-manager-controller/0.log" Mar 16 00:35:19 crc kubenswrapper[4983]: I0316 00:35:19.124714 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-g9j58_4c66c255-b5f4-4c72-8902-7225df93821d/cert-manager-cainjector/0.log" Mar 16 00:35:19 crc kubenswrapper[4983]: I0316 00:35:19.179677 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-jbjkj_1cb1bc27-146d-4df6-9e00-7e0cfb7f28ef/cert-manager-webhook/0.log" Mar 16 00:35:22 crc kubenswrapper[4983]: I0316 00:35:22.096257 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:22 crc kubenswrapper[4983]: E0316 00:35:22.097021 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:32 crc kubenswrapper[4983]: I0316 00:35:32.988611 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sn6x8_2af5ec54-bcc4-45f5-839a-135da91513a2/prometheus-operator/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.092609 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-kx26j_30d188b9-ab98-47a3-8143-3f58ae611dd6/prometheus-operator-admission-webhook/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.223053 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd_7e065fa9-405e-452b-bfe7-c4920a8577db/prometheus-operator-admission-webhook/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.284169 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-c99mb_05523d68-53d9-4cc5-a02b-5221a2396606/operator/0.log" Mar 16 00:35:33 crc kubenswrapper[4983]: I0316 00:35:33.428786 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dmdpt_8eb6b056-16ea-46db-b8ea-fd17a717a8e4/perses-operator/0.log" Mar 16 00:35:35 crc kubenswrapper[4983]: I0316 00:35:35.092788 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:35 crc kubenswrapper[4983]: E0316 00:35:35.093186 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.308362 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.460504 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.478041 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.502872 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.614950 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.626172 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.700810 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fxgfp4_48256dd4-332f-4a25-a535-4357e3b8eccb/extract/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.793680 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/util/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.994563 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/pull/0.log" Mar 16 00:35:46 crc kubenswrapper[4983]: I0316 00:35:46.994616 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.005055 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.140358 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/extract/0.log" Mar 
16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.153211 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.154088 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39etxf8x_8092d7d9-1bb8-44ce-bad9-4f36ba75b349/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.322019 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.466393 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.500022 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.531998 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.664699 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/util/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.689013 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/extract/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.717098 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5msqkm_cd45ab45-645e-45d3-a9eb-a3d1392b5f7a/pull/0.log" Mar 16 00:35:47 crc kubenswrapper[4983]: I0316 00:35:47.837420 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/util/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.029383 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/pull/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.042831 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/util/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.047837 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/pull/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.092744 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:35:48 crc kubenswrapper[4983]: E0316 00:35:48.092992 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.202744 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/util/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.241414 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/pull/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.243601 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f084ndg7_d4e5d5e8-e64e-4876-a604-976485b93449/extract/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.399669 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.529897 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.560823 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-content/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.574809 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-content/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: 
I0316 00:35:48.713834 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.723252 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/extract-content/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.945332 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-utilities/0.log" Mar 16 00:35:48 crc kubenswrapper[4983]: I0316 00:35:48.955842 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rxmlr_b4e15b89-9659-49da-bccb-c826ebceeb93/registry-server/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.066454 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.082515 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.132996 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.314012 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.320056 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.505276 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-pvjtd_46c9f8c6-7d08-47e7-866d-7f359e8683be/marketplace-operator/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.558386 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-95rsh_4deeaa90-9b0b-47cb-a8bf-4b2524a736a8/registry-server/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.633549 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.776239 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.795464 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.824366 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-content/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.946809 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-utilities/0.log" Mar 16 00:35:49 crc kubenswrapper[4983]: I0316 00:35:49.954248 4983 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/extract-content/0.log" Mar 16 00:35:50 crc kubenswrapper[4983]: I0316 00:35:50.154319 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hjdk_628d0b6e-5772-4af2-aa28-28cc15bd5d60/registry-server/0.log" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.092451 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:00 crc kubenswrapper[4983]: E0316 00:36:00.093308 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.155463 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560356-67f7r"] Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.156462 4983 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.158936 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.159131 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.160277 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.160400 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-67f7r"] Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.202729 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"auto-csr-approver-29560356-67f7r\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.304118 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"auto-csr-approver-29560356-67f7r\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.322208 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"auto-csr-approver-29560356-67f7r\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " 
pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.471318 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:00 crc kubenswrapper[4983]: I0316 00:36:00.937301 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560356-67f7r"] Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.079652 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-sn6x8_2af5ec54-bcc4-45f5-839a-135da91513a2/prometheus-operator/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.125854 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-zcpfd_7e065fa9-405e-452b-bfe7-c4920a8577db/prometheus-operator-admission-webhook/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.149012 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7c86566d45-kx26j_30d188b9-ab98-47a3-8143-3f58ae611dd6/prometheus-operator-admission-webhook/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.257375 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-c99mb_05523d68-53d9-4cc5-a02b-5221a2396606/operator/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.258007 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-dmdpt_8eb6b056-16ea-46db-b8ea-fd17a717a8e4/perses-operator/0.log" Mar 16 00:36:01 crc kubenswrapper[4983]: I0316 00:36:01.625202 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-67f7r" 
event={"ID":"e68b39b9-3d4f-4927-86de-6ee746b6165c","Type":"ContainerStarted","Data":"68cb14a2ffacefe11dbdef620c798b43906d8e7b2d884334faa869500824a5a5"} Mar 16 00:36:02 crc kubenswrapper[4983]: I0316 00:36:02.632468 4983 generic.go:334] "Generic (PLEG): container finished" podID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerID="577063654d19910ce0b39d5b8b0ea00c99859a562caea958335aa79b85b42df2" exitCode=0 Mar 16 00:36:02 crc kubenswrapper[4983]: I0316 00:36:02.632522 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-67f7r" event={"ID":"e68b39b9-3d4f-4927-86de-6ee746b6165c","Type":"ContainerDied","Data":"577063654d19910ce0b39d5b8b0ea00c99859a562caea958335aa79b85b42df2"} Mar 16 00:36:03 crc kubenswrapper[4983]: I0316 00:36:03.880822 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:03 crc kubenswrapper[4983]: I0316 00:36:03.957715 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") pod \"e68b39b9-3d4f-4927-86de-6ee746b6165c\" (UID: \"e68b39b9-3d4f-4927-86de-6ee746b6165c\") " Mar 16 00:36:03 crc kubenswrapper[4983]: I0316 00:36:03.971948 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z" (OuterVolumeSpecName: "kube-api-access-lg26z") pod "e68b39b9-3d4f-4927-86de-6ee746b6165c" (UID: "e68b39b9-3d4f-4927-86de-6ee746b6165c"). InnerVolumeSpecName "kube-api-access-lg26z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.059730 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg26z\" (UniqueName: \"kubernetes.io/projected/e68b39b9-3d4f-4927-86de-6ee746b6165c-kube-api-access-lg26z\") on node \"crc\" DevicePath \"\"" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.646802 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560356-67f7r" event={"ID":"e68b39b9-3d4f-4927-86de-6ee746b6165c","Type":"ContainerDied","Data":"68cb14a2ffacefe11dbdef620c798b43906d8e7b2d884334faa869500824a5a5"} Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.646839 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68cb14a2ffacefe11dbdef620c798b43906d8e7b2d884334faa869500824a5a5" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.646890 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560356-67f7r" Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.957742 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:36:04 crc kubenswrapper[4983]: I0316 00:36:04.966943 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560350-spjzd"] Mar 16 00:36:06 crc kubenswrapper[4983]: I0316 00:36:06.101532 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57c14e51-5c0b-467c-ba79-ac6f39239445" path="/var/lib/kubelet/pods/57c14e51-5c0b-467c-ba79-ac6f39239445/volumes" Mar 16 00:36:07 crc kubenswrapper[4983]: I0316 00:36:07.473251 4983 scope.go:117] "RemoveContainer" containerID="1accee3cbf491221453bc50f3bfc4fc15297e3a876200e21860d5dd4e3e66686" Mar 16 00:36:15 crc kubenswrapper[4983]: I0316 00:36:15.092251 4983 scope.go:117] "RemoveContainer" 
containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:15 crc kubenswrapper[4983]: E0316 00:36:15.093087 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:30 crc kubenswrapper[4983]: I0316 00:36:30.093980 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:30 crc kubenswrapper[4983]: E0316 00:36:30.094719 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:44 crc kubenswrapper[4983]: I0316 00:36:44.092808 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:44 crc kubenswrapper[4983]: E0316 00:36:44.093632 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:36:53 crc kubenswrapper[4983]: I0316 00:36:53.332599 4983 generic.go:334] 
"Generic (PLEG): container finished" podID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" exitCode=0 Mar 16 00:36:53 crc kubenswrapper[4983]: I0316 00:36:53.332700 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ldkzc/must-gather-dnctp" event={"ID":"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736","Type":"ContainerDied","Data":"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b"} Mar 16 00:36:53 crc kubenswrapper[4983]: I0316 00:36:53.337905 4983 scope.go:117] "RemoveContainer" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:36:54 crc kubenswrapper[4983]: I0316 00:36:54.203560 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldkzc_must-gather-dnctp_dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/gather/0.log" Mar 16 00:36:55 crc kubenswrapper[4983]: I0316 00:36:55.094229 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:36:55 crc kubenswrapper[4983]: E0316 00:36:55.094859 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86" Mar 16 00:37:00 crc kubenswrapper[4983]: I0316 00:37:00.949551 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:37:00 crc kubenswrapper[4983]: I0316 00:37:00.950232 4983 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-ldkzc/must-gather-dnctp" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy" 
containerID="cri-o://be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" gracePeriod=2 Mar 16 00:37:00 crc kubenswrapper[4983]: I0316 00:37:00.955510 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-ldkzc/must-gather-dnctp"] Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.295902 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldkzc_must-gather-dnctp_dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/copy/0.log" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.296948 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.314300 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") pod \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.314435 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") pod \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\" (UID: \"dfbc5d99-14f4-4c00-bb4e-a20b27fbd736\") " Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.322418 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc" (OuterVolumeSpecName: "kube-api-access-p2qxc") pod "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" (UID: "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736"). InnerVolumeSpecName "kube-api-access-p2qxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.394132 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" (UID: "dfbc5d99-14f4-4c00-bb4e-a20b27fbd736"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.398656 4983 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-ldkzc_must-gather-dnctp_dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/copy/0.log" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.400284 4983 generic.go:334] "Generic (PLEG): container finished" podID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" exitCode=143 Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.400347 4983 scope.go:117] "RemoveContainer" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.406912 4983 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ldkzc/must-gather-dnctp" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.417456 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2qxc\" (UniqueName: \"kubernetes.io/projected/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-kube-api-access-p2qxc\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.417491 4983 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.433626 4983 scope.go:117] "RemoveContainer" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.490514 4983 scope.go:117] "RemoveContainer" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" Mar 16 00:37:01 crc kubenswrapper[4983]: E0316 00:37:01.490998 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2\": container with ID starting with be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2 not found: ID does not exist" containerID="be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.491098 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2"} err="failed to get container status \"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2\": rpc error: code = NotFound desc = could not find container \"be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2\": container with ID starting with 
be4860e8788fb60fc178d869db537505163f11e21e1c83039e209816f08eacd2 not found: ID does not exist" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.491192 4983 scope.go:117] "RemoveContainer" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:37:01 crc kubenswrapper[4983]: E0316 00:37:01.491669 4983 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b\": container with ID starting with 186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b not found: ID does not exist" containerID="186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b" Mar 16 00:37:01 crc kubenswrapper[4983]: I0316 00:37:01.491714 4983 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b"} err="failed to get container status \"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b\": rpc error: code = NotFound desc = could not find container \"186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b\": container with ID starting with 186933ab1ee4140bd8cc6ab685ffaa66843120b8a8aa7e2e2f1e04a10499c64b not found: ID does not exist" Mar 16 00:37:02 crc kubenswrapper[4983]: I0316 00:37:02.100659 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" path="/var/lib/kubelet/pods/dfbc5d99-14f4-4c00-bb4e-a20b27fbd736/volumes" Mar 16 00:37:10 crc kubenswrapper[4983]: I0316 00:37:10.093397 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0" Mar 16 00:37:10 crc kubenswrapper[4983]: E0316 00:37:10.094602 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86"
Mar 16 00:37:24 crc kubenswrapper[4983]: I0316 00:37:24.092688 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"
Mar 16 00:37:24 crc kubenswrapper[4983]: E0316 00:37:24.093438 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86"
Mar 16 00:37:37 crc kubenswrapper[4983]: I0316 00:37:37.092718 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"
Mar 16 00:37:37 crc kubenswrapper[4983]: E0316 00:37:37.093610 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86"
Mar 16 00:37:50 crc kubenswrapper[4983]: I0316 00:37:50.093221 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"
Mar 16 00:37:50 crc kubenswrapper[4983]: E0316 00:37:50.094052 4983 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-7sbnj_openshift-machine-config-operator(48a48757-a3b8-4d4d-92ba-6a2459a26a86)\"" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" podUID="48a48757-a3b8-4d4d-92ba-6a2459a26a86"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.156408 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560358-ns9g8"]
Mar 16 00:38:00 crc kubenswrapper[4983]: E0316 00:38:00.157422 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="gather"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157444 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="gather"
Mar 16 00:38:00 crc kubenswrapper[4983]: E0316 00:38:00.157464 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerName="oc"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157477 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerName="oc"
Mar 16 00:38:00 crc kubenswrapper[4983]: E0316 00:38:00.157507 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157519 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157722 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="e68b39b9-3d4f-4927-86de-6ee746b6165c" containerName="oc"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157748 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="gather"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.157795 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfbc5d99-14f4-4c00-bb4e-a20b27fbd736" containerName="copy"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.158475 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.160721 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.162924 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.171607 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.177067 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-ns9g8"]
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.296039 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"auto-csr-approver-29560358-ns9g8\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") " pod="openshift-infra/auto-csr-approver-29560358-ns9g8"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.398164 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"auto-csr-approver-29560358-ns9g8\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") " pod="openshift-infra/auto-csr-approver-29560358-ns9g8"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.421430 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"auto-csr-approver-29560358-ns9g8\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") " pod="openshift-infra/auto-csr-approver-29560358-ns9g8"
Mar 16 00:38:00 crc kubenswrapper[4983]: I0316 00:38:00.497552 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8"
Mar 16 00:38:01 crc kubenswrapper[4983]: I0316 00:38:01.010699 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560358-ns9g8"]
Mar 16 00:38:01 crc kubenswrapper[4983]: W0316 00:38:01.013313 4983 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5014785_5ff5_4e7a_9347_12767e48ce8b.slice/crio-52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031 WatchSource:0}: Error finding container 52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031: Status 404 returned error can't find the container with id 52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031
Mar 16 00:38:01 crc kubenswrapper[4983]: I0316 00:38:01.898357 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" event={"ID":"b5014785-5ff5-4e7a-9347-12767e48ce8b","Type":"ContainerStarted","Data":"52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031"}
Mar 16 00:38:02 crc kubenswrapper[4983]: I0316 00:38:02.912178 4983 generic.go:334] "Generic (PLEG): container finished" podID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerID="20046d3f82f54e83f8b5bad0847dd44a1c63394206a4e16905a8f088a3bee614" exitCode=0
Mar 16 00:38:02 crc kubenswrapper[4983]: I0316 00:38:02.912305 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" event={"ID":"b5014785-5ff5-4e7a-9347-12767e48ce8b","Type":"ContainerDied","Data":"20046d3f82f54e83f8b5bad0847dd44a1c63394206a4e16905a8f088a3bee614"}
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.095178 4983 scope.go:117] "RemoveContainer" containerID="10e0124bd7c9379207922afe04fce5d20f300fc291123607458db10ead568fc0"
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.233830 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8"
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.360173 4983 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") pod \"b5014785-5ff5-4e7a-9347-12767e48ce8b\" (UID: \"b5014785-5ff5-4e7a-9347-12767e48ce8b\") "
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.367652 4983 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm" (OuterVolumeSpecName: "kube-api-access-26knm") pod "b5014785-5ff5-4e7a-9347-12767e48ce8b" (UID: "b5014785-5ff5-4e7a-9347-12767e48ce8b"). InnerVolumeSpecName "kube-api-access-26knm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.461500 4983 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26knm\" (UniqueName: \"kubernetes.io/projected/b5014785-5ff5-4e7a-9347-12767e48ce8b-kube-api-access-26knm\") on node \"crc\" DevicePath \"\""
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.930621 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-7sbnj" event={"ID":"48a48757-a3b8-4d4d-92ba-6a2459a26a86","Type":"ContainerStarted","Data":"a6c7ce0a8fa5ee2f9b1b96037cb7d6454f8ffd8ed071ec717005e5711d4eceb0"}
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.932859 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560358-ns9g8" event={"ID":"b5014785-5ff5-4e7a-9347-12767e48ce8b","Type":"ContainerDied","Data":"52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031"}
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.932913 4983 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52fa8c55f542cae13d5898de6d1213791a4b112a008bd9c51f526d82b48f6031"
Mar 16 00:38:04 crc kubenswrapper[4983]: I0316 00:38:04.932953 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560358-ns9g8"
Mar 16 00:38:05 crc kubenswrapper[4983]: I0316 00:38:05.297290 4983 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"]
Mar 16 00:38:05 crc kubenswrapper[4983]: I0316 00:38:05.304215 4983 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29560352-h2jj5"]
Mar 16 00:38:06 crc kubenswrapper[4983]: I0316 00:38:06.113186 4983 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69be3986-5dfe-49cd-a9c9-8bde7c59eaaf" path="/var/lib/kubelet/pods/69be3986-5dfe-49cd-a9c9-8bde7c59eaaf/volumes"
Mar 16 00:38:07 crc kubenswrapper[4983]: I0316 00:38:07.583298 4983 scope.go:117] "RemoveContainer" containerID="e9ad9d465cd26c47636106767bbd93622a4df6a39436eee8b4a72f1c036d34fd"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.172256 4983 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29560360-b78zr"]
Mar 16 00:40:00 crc kubenswrapper[4983]: E0316 00:40:00.175509 4983 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerName="oc"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.175562 4983 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerName="oc"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.176024 4983 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5014785-5ff5-4e7a-9347-12767e48ce8b" containerName="oc"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.176938 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-b78zr"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.180797 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.180933 4983 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-2mspm"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.181167 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-b78zr"]
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.186659 4983 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.241692 4983 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8l5\" (UniqueName: \"kubernetes.io/projected/46434756-a999-477f-9c06-3727a0752b16-kube-api-access-4s8l5\") pod \"auto-csr-approver-29560360-b78zr\" (UID: \"46434756-a999-477f-9c06-3727a0752b16\") " pod="openshift-infra/auto-csr-approver-29560360-b78zr"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.343582 4983 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8l5\" (UniqueName: \"kubernetes.io/projected/46434756-a999-477f-9c06-3727a0752b16-kube-api-access-4s8l5\") pod \"auto-csr-approver-29560360-b78zr\" (UID: \"46434756-a999-477f-9c06-3727a0752b16\") " pod="openshift-infra/auto-csr-approver-29560360-b78zr"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.374710 4983 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8l5\" (UniqueName: \"kubernetes.io/projected/46434756-a999-477f-9c06-3727a0752b16-kube-api-access-4s8l5\") pod \"auto-csr-approver-29560360-b78zr\" (UID: \"46434756-a999-477f-9c06-3727a0752b16\") " pod="openshift-infra/auto-csr-approver-29560360-b78zr"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.509323 4983 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-b78zr"
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.960019 4983 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29560360-b78zr"]
Mar 16 00:40:00 crc kubenswrapper[4983]: I0316 00:40:00.970583 4983 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 16 00:40:01 crc kubenswrapper[4983]: I0316 00:40:01.967846 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-b78zr" event={"ID":"46434756-a999-477f-9c06-3727a0752b16","Type":"ContainerStarted","Data":"3aa6fb62136fc0f873740625504cd1fdc9036f62386a42240131bfbed472f880"}
Mar 16 00:40:02 crc kubenswrapper[4983]: I0316 00:40:02.974526 4983 generic.go:334] "Generic (PLEG): container finished" podID="46434756-a999-477f-9c06-3727a0752b16" containerID="31cbe57ea280fc680a823a7db247b537ecaa54df845636c69f7270c931a9a663" exitCode=0
Mar 16 00:40:02 crc kubenswrapper[4983]: I0316 00:40:02.974592 4983 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29560360-b78zr" event={"ID":"46434756-a999-477f-9c06-3727a0752b16","Type":"ContainerDied","Data":"31cbe57ea280fc680a823a7db247b537ecaa54df845636c69f7270c931a9a663"}
Mar 16 00:40:04 crc kubenswrapper[4983]: I0316 00:40:04.325729 4983 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29560360-b78zr"